From ncoghlan at gmail.com  Sun May  1 14:24:57 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun May  1 14:25:07 2005
Subject: [Python-Dev] Keyword for block statements
In-Reply-To: <Pine.LNX.4.58.0504300227440.4786@server1.LFW.org>
References: <ca471dc205042916194f20c27@mail.gmail.com>	<d4tsik$4be$1@sea.gmane.org>	<ca471dc205042916194f20c27@mail.gmail.com>	<5.1.1.6.0.20050429213620.0322cec0@mail.telecommunity.com>
	<Pine.LNX.4.58.0504300227440.4786@server1.LFW.org>
Message-ID: <4274CA99.6010006@gmail.com>

Ka-Ping Yee wrote:
> The programmer who writes the function used to introduce a block
> can hardly be relied upon to explain the language semantics.  We
> don't expect the docstring of every class to repeat an explanation
> of Python classes, for example.  The language reference manual is
> for that; it's a different level of documentation.

Would 'suite' work as the keyword?

Calling these things 'suite' statements would match the Python grammar, give an 
obvious visual indicator through the use of a keyword, reduce any confusion 
resulting from the differences between Python suites and Ruby blocks (since the 
names would now be different), and avoid confusion due to the multiple meanings 
of the word 'block'.

And really, what PEP 340 creates is the ability to have user-defined suites to 
complement the standard control structures.

Anyway, here's the examples from the PEP using 'suite' as the keyword:

         suite synchronized(myLock):
             # Code here executes with myLock held.  The lock is
             # guaranteed to be released when the block is left (even
             # if by an uncaught exception).

         suite opening("/etc/passwd") as f:
             for line in f:
                 print line.rstrip()

         suite transactional(db):
             # Perform database operation inside transaction

         suite auto_retry(3, IOError):
             f = urllib.urlopen("http://python.org/peps/pep-0340.html")
             print f.read()

         suite synchronized_opening("/etc/passwd", myLock) as f:
             for line in f:
                 print line.rstrip()
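For readers following along, here is a minimal present-day sketch of what a template like opening() could look like. Note this uses contextlib.contextmanager and the with statement from later Python, not PEP 340's actual protocol, purely as an assumption for illustration:

```python
from contextlib import contextmanager

@contextmanager
def opening(filename):
    # Acquire the resource; the finally clause guarantees release when
    # the managed suite is left, even via an uncaught exception.
    f = open(filename)
    try:
        yield f
    finally:
        f.close()
```

Usage would mirror the examples above: `with opening("/etc/passwd") as f:` followed by the suite.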

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan@gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net
From ncoghlan at iinet.net.au  Sun May  1 15:02:50 2005
From: ncoghlan at iinet.net.au (Nick Coghlan)
Date: Sun May  1 15:02:55 2005
Subject: [Python-Dev] PEP 340: Else clause for block statements
Message-ID: <4274D37A.4020007@iinet.net.au>

As yet, I don't have a particularly firm opinion on whether or not block 
statements should support an 'else:' clause. And there are obviously a great 
many other questions to be answered about how block statements might work that 
are more important than this one.

Still, I've been tinkering with some ideas for how to approach this, and thought 
I'd write them up for everyone else's consideration.

Option 0:
    No else clause allowed. Figured I should mention this, since it is Guido's 
last reported inclination, and my total lack of use cases for the other options 
below suggests this is the best idea for an initial implementation.

Option 1: mimic try, for, while semantics
    An 'else' clause on a block statement behaves like the else clause on for 
and while loops, and on try/except statements - the clause is executed only if 
the managed suite completes 'normally' (i.e. it is not terminated early due to 
an exception, a break statement or a return statement)

Option 2: mimic if semantics
   An 'else' clause on a block statement behaves vaguely like the else clause on 
an if statement - the clause is executed only if the first suite is never 
entered, but no exception occurs (i.e. StopIteration is raised by the first call 
to next).

Option 3: iterator-controlled semantics
   The iterator is given the ability to control whether or not the else clause 
is executed (e.g. via an attribute of StopIteration), probably using option 1 
above as the default behaviour.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan@gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net
From aahz at pythoncraft.com  Sun May  1 15:43:34 2005
From: aahz at pythoncraft.com (Aahz)
Date: Sun May  1 15:43:37 2005
Subject: [Python-Dev] PEP 340: Else clause for block statements
In-Reply-To: <4274D37A.4020007@iinet.net.au>
References: <4274D37A.4020007@iinet.net.au>
Message-ID: <20050501134334.GA24100@panix.com>

On Sun, May 01, 2005, Nick Coghlan wrote:
>
> Option 0:
>    No else clause allowed. Figured I should mention this, since it is 
>    Guido's last reported inclination, and my total lack of use cases for the 
> other options below suggests this is the best idea for an initial 
> implementation.

+1

> Option 1: mimic try, for, while semantics
>    An 'else' clause on a block statement behaves like the else clause on 
>    for and while loops, and on try/except statements - the clause is executed 
> only if the managed suite completes 'normally' (i.e. it is not terminated 
> early due to an exception, a break statement or a return statement)

+0

> Option 2: mimic if semantics
>   An 'else' clause on a block statement behaves vaguely like the else 
>   clause on an if statement - the clause is executed only if the first suite 
> is never entered, but no exception occurs (i.e. StopIteration is raised by 
> the first call to next).

-0

> Option 3: iterator-controlled semantics
>   The iterator is given the ability to control whether or not the else 
>   clause is executed (e.g. via an attribute of StopIteration), probably using 
> option 1 above as the default behaviour.

-1

Did you deliberately sort the options this way?  ;-)  I'm mainly
responding to deliver my vote against option 3; I don't care much about
the other possibilities.
-- 
Aahz (aahz@pythoncraft.com)           <*>         http://www.pythoncraft.com/

"It's 106 miles to Chicago.  We have a full tank of gas, a half-pack of
cigarettes, it's dark, and we're wearing sunglasses."  "Hit it."
From gvanrossum at gmail.com  Mon May  2 02:25:47 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Sun, 1 May 2005 17:25:47 -0700
Subject: [Python-Dev] Keyword for block statements
In-Reply-To: <4274CA99.6010006@gmail.com>
References: <d4tsik$4be$1@sea.gmane.org>
	<ca471dc205042916194f20c27@mail.gmail.com>
	<5.1.1.6.0.20050429213620.0322cec0@mail.telecommunity.com>
	<Pine.LNX.4.58.0504300227440.4786@server1.LFW.org>
	<4274CA99.6010006@gmail.com>
Message-ID: <ca471dc2050501172563f5ab96@mail.gmail.com>

[Nick Coghlan]
> Would 'suite' work as the keyword?
> 
> Calling these things 'suite' statements would match the Python grammar,

Actually that's an argument *against* -- too confusing to have two
things we call suite.

> give an
> obvious visual indicator through the use of a keyword, reduce any confusion
> resulting from the differences between Python suites and Ruby blocks (since the
> names would now be different),

There's no need for that; they are close enough most of the time anyway.

> and avoid confusion due to the multiple meanings
> of the word 'block'.

Actually, in Python that's always called a suite, not a block. (Though
the reference manual defines "code block" as a compilation unit.)

> And really, what PEP 340 creates is the ability to have user-defined suites to
> complement the standard control structures.

Given that suite and block are so close in "intuitive" meaning, if
there were no convincing argument for either, I still like "block"
much better -- perhaps because suite is the technical term used all
over the grammar.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Mon May  2 02:44:16 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Sun, 1 May 2005 17:44:16 -0700
Subject: [Python-Dev] PEP 340: Else clause for block statements
In-Reply-To: <4274D37A.4020007@iinet.net.au>
References: <4274D37A.4020007@iinet.net.au>
Message-ID: <ca471dc205050117447870ef99@mail.gmail.com>

[Nick Coghlan]
> As yet, I don't have a particularly firm opinion on whether or not block
> statements should support an 'else:' clause. And there are obviously a great
> many other questions to be answered about how block statements might work that
> are more important than this one.
> 
> Still, I've been tinkering with some ideas for how to approach this, and thought
> I'd write them up for everyone else's consideration.
> 
> Option 0:
>     No else clause allowed. Figured I should mention this, since it is Guido's
> last reported inclination, and my total lack of use cases for the other options
> below suggests this is the best idea for an initial implementation.

The more I think about it the more this makes the most sense because
it is the easiest to understand.

> Option 1: mimic try, for, while semantics
>     An 'else' clause on a block statement behaves like the else clause on for
> and while loops, and on try/except statements - the clause is executed only if
> the managed suite completes 'normally' (i.e. it is not terminated early due to
> an exception, a break statement or a return statement)

You'd have to define this very carefully. Because break is implemented
by passing StopIteration to the __next__ or __exit__ method (depending
on which alternative API we end up picking), and StopIteration is also
how these methods signal that the loop is over, it's a little tricky.
Assuming we go with __exit__ to pass an exception to the
iterator/generator, we could define that the else clause is executed
when the __next__ method raises StopIteration -- this would imply
exhaustion of the iterator from natural causes. This has the advantage
of matching the behavior of a for loop.
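The for-loop behaviour being matched here - else runs only on natural exhaustion of the iterator, never after a break - can be checked with a small sketch (the helper name is made up for illustration):

```python
def loop_outcome(items, stop_at):
    # The else clause of a for loop runs only when the iterator is
    # exhausted naturally, i.e. the loop was not terminated by break.
    for item in items:
        if item == stop_at:
            break
    else:
        return "exhausted"
    return "broke early"
```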

> Option 2: mimic if semantics
>    An 'else' clause on a block statement behaves vaguely like the else clause on
> an if statement - the clause is executed only if the first suite is never
> entered, but no exception occurs (i.e. StopIteration is raised by the first call
> to next).

Strange because it's different from the behavior of a for loop, and
the block-statement doesn't feel like an if-statement at all. But I
could actually imagine a use case: when acquiring a lock with a
time-out, the else-clause could be executed when the acquisition times
out.

  block locking(myLock, timeout=30):
      ...code executed with lock held...
  else:
      ...code executed if lock not acquired...

But I'm not convinced that this shouldn't be handled with a try/except
around it all; the use case doesn't appear all that common, and it
scares me that when the lock isn't acquired, this happens entirely
silently when there is no else-clause.
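A plain-function approximation of this use case is possible in today's Python, with Lock.acquire's timeout parameter standing in for the hypothetical locking(..., timeout=...) template (all names here are illustrative, not from the PEP):

```python
import threading

def locked_call(lock, timeout, body, on_timeout):
    # Run body() with the lock held; fall back to on_timeout() if the
    # lock cannot be acquired in time -- the role the proposed
    # else-clause would play.
    if lock.acquire(timeout=timeout):
        try:
            return body()
        finally:
            lock.release()
    return on_timeout()
```

The point of the else-clause proposal is precisely to avoid threading the two code paths through callables like this.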

> Option 3: iterator-controlled semantics
>    The iterator is given the ability to control whether or not the else clause
> is executed (e.g. via an attribute of StopIteration), probably using option 1
> above as the default behaviour.

A slightly cleaner version would be to have a separate subclass of
StopIteration for this purpose. But I see serious problems with
explaining when the else-clause is executed, because it's too dynamic.
It does solve one problem with option 2 though: if there's no
else-clause, and ElseIteration is raised, that could become an error
rather than being ignored silently.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From greg.ewing at canterbury.ac.nz  Mon May  2 05:02:03 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 02 May 2005 15:02:03 +1200
Subject: [Python-Dev] PEP 340: Else clause for block statements
In-Reply-To: <4274D37A.4020007@iinet.net.au>
References: <4274D37A.4020007@iinet.net.au>
Message-ID: <4275982B.5060402@canterbury.ac.nz>

Nick Coghlan wrote:

> Option 1: mimic try, for, while semantics
>    An 'else' clause on a block statement behaves like the else clause on 
> for and while loops, and on try/except statements - the clause is 
> executed only if the managed suite completes 'normally' (i.e. it is not 
> terminated early due to an exception, a break statement or a return 
> statement)

I've always thought that was a particularly unintuitive use
of the word 'else', and I'm not sure I'd like it to be
extended to any new statements.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From gvanrossum at gmail.com  Mon May  2 05:42:35 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Sun, 1 May 2005 20:42:35 -0700
Subject: [Python-Dev] PEP 340 - possible new name for block-statement
In-Reply-To: <5.1.1.6.0.20050429212046.032abd70@mail.telecommunity.com>
References: <ca471dc205042815557616722b@mail.gmail.com>
	<4271F71B.8010000@gmail.com> <20050429163854.GB14920@panix.com>
	<5.1.1.6.0.20050429130113.033208b0@mail.telecommunity.com>
	<ca471dc205042910162befaaee@mail.gmail.com>
	<5.1.1.6.0.20050429134751.03099cb0@mail.telecommunity.com>
	<5.1.1.6.0.20050429170733.031a74e0@mail.telecommunity.com>
	<20050429224300.GA9425@panix.com>
	<ca471dc205042916024c03501a@mail.gmail.com>
	<5.1.1.6.0.20050429212046.032abd70@mail.telecommunity.com>
Message-ID: <ca471dc205050120425332f1a9@mail.gmail.com>

[Phillip]
> By the way, I notice PEP 340 has two outstanding items with my name on
> them; let me see if I can help eliminate one real quick.
> 
> Tracebacks: it occurs to me that I may have unintentionally given the
> impression that I need to pass in an arbitrary traceback, when in fact I
> only need to pass in the current sys.exc_info().

I've updated the PEP (tying a couple of loose ends and making the
promised change to the new API); I've decided to change the signature
of __exit__() to be a triple matching the return value of
sys.exc_info(), IOW the same as the "signature" of the
raise-statement.

There are still a few loose ends left, including the alternative API
that you've proposed (which I'm not super keen on, to be sure, but
which is still open for consideration).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From rijishvr at rediffmail.com  Mon May  2 09:46:12 2005
From: rijishvr at rediffmail.com (rijish valoorthodi rajan)
Date: 2 May 2005 07:46:12 -0000
Subject: [Python-Dev] (no subject)
Message-ID: <20050502074612.15892.qmail@webmail36.rediffmail.com>

  
hello all
I am a member of a team dedicated to making a client-server database application, and our main concern is the speed with which the system performs. We are very new to Python, but after reading a lot of documents and consulting some experts we decided to work it out using PYTHON. We plan to make 2 programs, one running in the client systems and one that runs in the server. Can anyone please help me by telling the things that I should take care of while designing the project, and what tools and what style we should adopt to make the program optimised?

regards
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mail.python.org/pipermail/python-dev/attachments/20050502/b690e7ba/attachment.htm

From ajm at flonidan.dk  Mon May  2 12:03:06 2005
From: ajm at flonidan.dk (Anders J. Munch)
Date: Mon, 2 May 2005 12:03:06 +0200 
Subject: [Python-Dev] PEP 340: Else clause for block statements
Message-ID: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net>

GvR wrote:
> [Nick Coghlan]
> > Option 2: mimic if semantics
> >    An 'else' clause on a block statement behaves vaguely like the else
> > clause on an if statement - the clause is executed only if the first
> > suite is never entered, but no exception occurs (i.e. StopIteration is
> > raised by the first call to next).
> 
> Strange because it's different from the behavior of a for loop, and
> the block-statement doesn't feel like an if-statement at all. But I
> could actually imagine a use case: when acquiring a lock with a
> time-out, the else-clause could be executed when the acquisition times
> out.
> 
>   block locking(myLock, timeout=30):
>       ...code executed with lock held...
>   else:
>       ...code executed if lock not acquired...

A file-closing block function has the same need, as does any block
function that manages a resource, whose acquisition might fail.

A surrounding try/except doesn't quite cut it; the problem is that the
try-clause will then also cover the suite.

Example:

    try:
        in opening('file1') as f1:
            ...
            in opening('file2') as f2:
                ...
    except IOError:
        print "file1 not available, I'll try again later"

How do I tell try/except that I really only meant to trap
opening('file1'), but opening 'file2' is not supposed to fail so I
want any exception from that propagated?  Better if I could write:

    in opening('file1') as f1:
        ...
        in opening('file2') as f2:
            ...
    else:
        print "file1 not available, I'll try again later"

or even

    in opening('file1') as f1:
        ...
        in opening('file2') as f2:
            ...
    except IOError:
        print "file1 not available, I'll try again later"

I rather like this version, because it is patently clear what should
happen if there is no except-clause: The exception propagates
normally.
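The scoping problem Anders describes can be made concrete with a small sketch: today, the only way to trap a failure of the acquisition alone is to pull the open() out of the suite it guards (the helper name is illustrative):

```python
def read_if_available(path, fallback):
    # Narrow the try to the acquisition itself, so an IOError raised
    # inside the managed suite still propagates to the caller.
    try:
        f = open(path)
    except IOError:
        return fallback
    try:
        return f.read()
    finally:
        f.close()
```

The proposed except-clause would let the block statement express this narrowing directly.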

- Anders

From john at hazen.net  Mon May  2 12:32:53 2005
From: john at hazen.net (John Hazen)
Date: Mon, 2 May 2005 03:32:53 -0700
Subject: [Python-Dev] (no subject)
In-Reply-To: <20050502074612.15892.qmail@webmail36.rediffmail.com>
References: <20050502074612.15892.qmail@webmail36.rediffmail.com>
Message-ID: <20050502103253.GH7085@gate2.hazen.net>

Hi Rijish-

The python-dev list is for developers *of* python, not for people
developing *with* python.  I'd recommend you post this on the
python-list, but I see you already have.  You'll find they can be very
helpful, if you show that you've done some research, and ask a specific
question.  A subject line always helps, too.

Good luck with your project!

-John

* rijish valoorthodi rajan <rijishvr at rediffmail.com> [2005-05-02 00:29]:
>   
> hello all
> I am a member of a team dedicated to make a client server database application and our main concern is the speed with which the system performs. we are very new to python. but after reading a lot of documents and consulting some experts we decided to work it out using PYTHON. we plan to make 2 programs one running in the client systems and one that run in server. can any one please help me by telling the thins that i should take care of while designing the project and what tools and what style we should adopt to make the program optimised.
> 
> regards

From shane at hathawaymix.org  Mon May  2 15:46:31 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Mon, 02 May 2005 07:46:31 -0600
Subject: [Python-Dev] PEP 340: Else clause for block statements
In-Reply-To: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net>
References: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net>
Message-ID: <42762F37.9000605@hathawaymix.org>

Anders J. Munch wrote:
>     in opening('file1') as f1:
>         ...
>         in opening('file2') as f2:
>             ...
>     except IOError:
>         print "file1 not available, I'll try again later"
> 
> I rather like this version, because it is patently clear what should
> happen if there is no except-clause: The exception propagates
> normally.

My eyes would expect the exception handler to also catch IOErrors
generated inside the block statement body.  My eyes would be deceiving
me, of course, but Python isn't currently so subtle and it probably
shouldn't be.

You could also do this with a suitable iterator.

    def opening_or_skipping(fn):
        try:
            f = open(fn)
        except IOError:
            print "file1 not available, I'll try again later"
        else:
            try:
                yield f
            finally:
                f.close()

Shane

From skip at pobox.com  Mon May  2 15:46:31 2005
From: skip at pobox.com (Skip Montanaro)
Date: Mon, 2 May 2005 08:46:31 -0500
Subject: [Python-Dev] PEP 340: Else clause for block statements
In-Reply-To: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net>
References: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net>
Message-ID: <17014.12087.543854.75179@montanaro.dyndns.org>


    Anders> How do I tell try/except that I really only meant to trap
    Anders> opening('file1'), but opening 'file2' is not supposed to fail so
    Anders> I want any exception from that propagated?  Better if I could
    Anders> write:

    Anders>     in opening('file1') as f1:
    Anders>         ...
    Anders>         in opening('file2') as f2:
    Anders>             ...
    Anders>     else:
    Anders>         print "file1 not available, I'll try again later"

-1.  This has the opposite meaning of the else clause in while/for
statements.

    Anders> or even

    Anders>     in opening('file1') as f1:
    Anders>         ...
    Anders>         in opening('file2') as f2:
    Anders>             ...
    Anders>     except IOError:
    Anders>         print "file1 not available, I'll try again later"

Not keen on this either, maybe just because the "in" clause isn't a "try"
clause.

Skip


From exarkun at divmod.com  Mon May  2 16:02:49 2005
From: exarkun at divmod.com (Jp Calderone)
Date: Mon, 02 May 2005 14:02:49 GMT
Subject: [Python-Dev] PEP 340: Else clause for block statements
In-Reply-To: <42762F37.9000605@hathawaymix.org>
Message-ID: <20050502140249.15422.1339448410.divmod.quotient.13812@ohm>

On Mon, 02 May 2005 07:46:31 -0600, Shane Hathaway <shane at hathawaymix.org> wrote:
>Anders J. Munch wrote:
>>     in opening('file1') as f1:
>>         ...
>>         in opening('file2') as f2:
>>             ...
>>     except IOError:
>>         print "file1 not available, I'll try again later"
>>
>> I rather like this version, because it is patently clear what should
>> happen if there is no except-clause: The exception propagates
>> normally.
>
>My eyes would expect the exception handler to also catch IOErrors
>generated inside the block statement body.  My eyes would be deceiving
>me, of course, but Python isn't currently so subtle and it probably
>shouldn't be.
>
>You could also do this with a suitable iterator.
>
>    def opening_or_skipping(fn):
>        try:
>            f = open(fn)
>        except IOError:
>            print "file1 not available, I'll try again later"
>        else:
>            try:
>                yield f
>            finally:
>                f.close()

  I don't think this version is really of much use.  It requires that
you implement a different iterator for each kind of error handling you
want to do.  Avoiding multiple different implementations is supposed to
be one of the main selling points of this feature.

  Jp

From lcaamano at gmail.com  Mon May  2 16:18:07 2005
From: lcaamano at gmail.com (Luis P Caamano)
Date: Mon, 2 May 2005 10:18:07 -0400
Subject: [Python-Dev] PEP 340 - possible new name for block-statement
In-Reply-To: <20050430005315.C9A7D1E400B@bag.python.org>
References: <20050430005315.C9A7D1E400B@bag.python.org>
Message-ID: <c56e219d05050207187cd146a2@mail.gmail.com>

On 4/29/05, Reinhold Birkenfeld  wrote:
> Date: Sat, 30 Apr 2005 00:53:12 +0200
> From: Reinhold Birkenfeld <reinhold-birkenfeld-nospam at wolke7.net>
> Subject: [Python-Dev] Re: PEP 340 - possible new name for
>        block-statement
> To: python-dev at python.org
> Message-ID: <d4udl0$7j3$1 at sea.gmane.org>
> Content-Type: text/plain; charset=ISO-8859-1
> 
> 
> FWIW, the first association when seeing
> 
> block something:
> 
> is with the verb "to block", and not with the noun, which is most displeasing.
> 
> Reinhold
> 

Which is the reason I thought of "bracket" instead.  Although it's also a
noun and a verb, the verb doesn't imply "stop" like block does.  However,
because one of the main features of python is that it's easy to read,
adding "with" to it makes it very clear as in "bracket_with".  Ugly at
first, but that's just a matter of familiarity.  You never notice that your
ugly friend is really that ugly anymore, right?

bracket_with foo(arg1, arg2) as f:
  BLOCK

seems very explicit to me.

However, I do prefer no keyword at all and that would be my first choice,
but if we have to choose a keyword, "block" has that "stop" connotation
that will certainly confuse more than a few but I doubt people would
go with "bracket_with."  

I certainly hope no-keyword is possible.

-- 
Luis P Caamano
Atlanta, GA USA

From gvanrossum at gmail.com  Mon May  2 16:57:19 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 2 May 2005 07:57:19 -0700
Subject: [Python-Dev] PEP 340: Else clause for block statements
In-Reply-To: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net>
References: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net>
Message-ID: <ca471dc20505020757316a2f47@mail.gmail.com>

[Guido, presenting a use case]
> >   block locking(myLock, timeout=30):
> >       ...code executed with lock held...
> >   else:
> >       ...code executed if lock not acquired...

[Anders Munch]
> A file-closing block function has the same need, as does any block
> function that manages a resource, whose acquisition might fail.
> 
> A surrounding try/except doesn't quite cut it; the problem is that the
> try-clause will then also cover the suite.
> 
> Example:
> 
>     try:
>         in opening('file1') as f1:
>             ...
>             in opening('file2') as f2:
>                 ...
>     except IOError:
>         print "file1 not available, I'll try again later"

I thought of this and several other solutions overnight and didn't like
any. Finally I realized that this use case is better covered by
letting the generator return an error value:

def opening_w_err(filename):
    try:
        f = open(filename)
    except IOError, err:
        yield None, err
    else:
        try:
            yield f, None
        finally:
            f.close()

The user can then write:

block opening_w_err(filename) as f, err:
    if f:
        ...code using f...
    else:
        ...error handling code using err...

Besides, in many cases it's totally acceptable to put a try/except
block around the entire block-statement, if the exception it catches
is specific enough (like IOError). For example:

try:
    block opening(filename) as f:
        ...code using f...
except IOError, err:
    ...error handling code using err...

So I'm more than ever in favor of keeping the block-statement simple,
i.e. without any additional clauses.
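Guido's generator translates almost verbatim into later Python; as a hedged aside for readers today, here is a transcription using contextlib.contextmanager (which postdates this thread) in place of the block statement:

```python
from contextlib import contextmanager

@contextmanager
def opening_w_err(filename):
    # Yield (file, None) on success and (None, error) on failure, so
    # the caller can branch without a surrounding try/except.
    try:
        f = open(filename)
    except IOError as err:
        yield None, err
    else:
        try:
            yield f, None
        finally:
            f.close()
```

The caller writes `with opening_w_err(name) as (f, err):` and branches on whether f is None.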

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From walter at livinglogic.de  Mon May  2 18:06:58 2005
From: walter at livinglogic.de (=?ISO-8859-1?Q?Walter_D=F6rwald?=)
Date: Mon, 02 May 2005 18:06:58 +0200
Subject: [Python-Dev] Generating nested data structures with blocks
Message-ID: <42765022.1080900@livinglogic.de>

Reading PEP 340, it seems to me that blocks could be used for generating 
nested data structures:

def blist(list):
	def enter(parent=None):
		if parent:
			parent.append(self)
		yield self

x = blist()
block x.enter() as x:
	x.append(1)
	block blist().enter(x) as x:
		x.append(2)
	x.append(3)

print x

this should print [1, [2], 3]

For this to work, the scope of the block variable has to end with the 
end of the block. Currently the PEP leaves this unspecified.
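A runnable present-day transcription of the idea (with the class/def slips fixed, and using contextlib.contextmanager as an assumed stand-in for the block protocol) illustrates the point: since today's with statement does not limit the bound variable's scope, distinct names must be used where the original reuses x:

```python
from contextlib import contextmanager

class blist(list):
    @contextmanager
    def enter(self, parent=None):
        # Attach ourselves to the enclosing list, then hand ourselves
        # to the nested suite.
        if parent is not None:
            parent.append(self)
        yield self

x = blist()
with x.enter() as outer:
    outer.append(1)
    with blist().enter(outer) as inner:
        inner.append(2)
    outer.append(3)
```

With block-scoped variables as proposed, the inner name could shadow the outer one and revert at block exit.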

Bye,
    Walter Dörwald

From walter at livinglogic.de  Mon May  2 18:40:06 2005
From: walter at livinglogic.de (=?ISO-8859-1?Q?Walter_D=F6rwald?=)
Date: Mon, 02 May 2005 18:40:06 +0200
Subject: [Python-Dev] Generating nested data structures with blocks
In-Reply-To: <42765022.1080900@livinglogic.de>
References: <42765022.1080900@livinglogic.de>
Message-ID: <427657E6.9080007@livinglogic.de>

Walter Dörwald wrote:

> [...]
> def blist(list):
> 	def enter(parent=None):

Of course this was meant to be:

class blist(list):
     def enter(self, parent=None):

Bye,
    Walter Dörwald

From gvanrossum at gmail.com  Tue May  3 02:55:56 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 2 May 2005 17:55:56 -0700
Subject: [Python-Dev] PEP 340 -- loose ends
Message-ID: <ca471dc20505021755518773c8@mail.gmail.com>

These are the loose ends on the PEP (apart from filling in some
missing sections):

1. Decide on a keyword to use, if any.

2. Decide on the else clause.

3. Decide on Phillip Eby's proposal to have a different API for
blocks, so you would have to use a @decorator to turn a generator into
something usable in a block.

Here are my strawman decisions, to the extent that I'm clear on them:

1. I still can't decide on keyword vs. no keyword, but if we're going
to have a keyword, I haven't seen a better proposal than block. So
it's either block or nothing. I'll sleep on this. Feel free to start
an all-out flame war on this in c.l.py. ;-)

2. No else clause; the use case is really weak and there are too many
possible semantics. It's not clear whether to generalize from
for/else, or if/else, or what else.

3. I'm leaning against Phillip's proposal; IMO it adds more complexity
for very little benefit.

Unless there's more discussion on any of these, I'll probably finish
up the PEP and post it to c.l.py in a few days.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Tue May  3 03:39:19 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Mon, 02 May 2005 21:39:19 -0400
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <ca471dc20505021755518773c8@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com>

At 05:55 PM 5/2/05 -0700, Guido van Rossum wrote:
>3. I'm leaning against Phillip's proposal; IMO it adds more complexity
>for very little benefit.

Little benefit, I'll agree with, even though there are EIBTI and TOOWTDI 
benefits as well as Errors Should Never Pass Silently.  But the only added 
implementation complexity is the decorator -- balanced against the removal 
of the need for a 'next()' builtin.  I also believe that the approach 
actually *reduces* pedagogical complexity by not allowing any blurring 
between the concept of an iterator and the concept of a block template.

Since I'm not sure if anybody besides you is aware of what I proposed, I'll 
attempt to recap here, and then step to allow discussion.  If there's no 
community support, I'll let it die a natural death, because it's ultimately 
a "purity" question rather than a practical one, though I think that other 
people who teach Python programming should weigh in on this.

Specifically, I propose that PEP 340 *not* allow the use of "normal" 
iterators.  Instead, the __next__ and __exit__ methods would be an 
unrelated protocol.  This would eliminate the need for a 'next()' builtin, 
and avoid any confusion between today's iterators and a template function 
for use with blocks.

Because today's generators were also not written with blocks in mind, it 
would also be necessary to use a @decorator to declare that a generator is 
in fact a block template.  Possibly something like:

     @blocktemplate
     def retry(times):
         for i in xrange(times):
             try:
                 yield
             except StopIteration:
                 return
             except:
                 continue
             else:
                 return
         raise

My argument is that this is both Explicit (i.e., better than implicit) and 
One Obvious Way (because using existing iterators is just Another Way to do a 
"for" loop).  It also doesn't allow Errors (using an iterator with no 
special semantics) to Pass Silently.

Of course, since Practicality Beats Purity, I could give this all up.  But 
I don't think the Implementation is Hard to Explain, as it should be just 
as easy as Guido's proposal.  Instead of a 'next()' builtin, one would 
instead implement a 'blocktemplate' decorator (or whatever it's to be 
called).  The same __next__/__exit__/next methods have to be implemented as 
in Guido's proposal.  Really, the only thing that changes is that you get a 
TypeError when a template function returns an iterator instead of a block 
template, and you have to use the decorator on your generators to 
explicitly label them safe for use with blocks.  (Hand-crafted block 
templates still just implement __next__ and __exit__, in the same way as 
they would under Guido's proposal, so no real change there.)
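
To make the shape of this concrete, here is a minimal sketch (in today's
Python) of what such a decorator and type check might look like.  All the
names here -- BlockTemplate, blocktemplate, run_block -- are hypothetical
illustrations of the proposal, not anything defined by the PEP:

```python
class BlockTemplate:
    """Hypothetical wrapper marking a generator as safe for block use.
    It exposes the block protocol (__next__/__exit__) only."""
    def __init__(self, gen):
        self._gen = gen

    def __next__(self):
        return next(self._gen)

    def __exit__(self, type, value=None, traceback=None):
        # Default behavior: simply re-raise the exception passed in.
        raise value


def blocktemplate(genfunc):
    """Decorator: calling the function returns a BlockTemplate,
    not a plain iterator."""
    def wrapper(*args, **kwds):
        return BlockTemplate(genfunc(*args, **kwds))
    return wrapper


def run_block(template, body):
    """Rough sketch of the block statement's expansion: it type-checks
    its argument, so passing an undecorated iterator fails loudly."""
    if not isinstance(template, BlockTemplate):
        raise TypeError("block statement requires a block template, "
                        "not a plain iterator")
    while True:
        try:
            template.__next__()
        except StopIteration:
            break
        body()
```

With this in place, run_block(iter([1, 2]), body) raises TypeError
immediately, while a @blocktemplate-decorated generator works as before.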

Guido may also have other reasons to take a different direction that he may 
not have expressed; e.g. maybe in Py3K there'll be no "for", just "iter(x) 
as y:"?  Or...?

I don't claim to have any special smarts about this, but other people 
(including Guido) have previously expressed reservations about the 
near-blending of iteration and block control that PEP 340 allows.  So, I've 
thrown out this proposal as an attempt to address those reservations.  YMMV.


From pje at telecommunity.com  Tue May  3 03:43:02 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Mon, 02 May 2005 21:43:02 -0400
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com>
References: <ca471dc20505021755518773c8@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050502214241.02c3a890@mail.telecommunity.com>

At 09:39 PM 5/2/05 -0400, Phillip J. Eby wrote:
>attempt to recap here, and then step to allow discussion.  If there's no

Argh.  That was supposed to be, "step aside to allow discussion".


From tdelaney at avaya.com  Tue May  3 04:14:36 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Tue, 3 May 2005 12:14:36 +1000
Subject: [Python-Dev] PEP 340 -- loose ends
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721275@au3010avexu1.global.avaya.com>

Phillip J. Eby wrote:

> Specifically, I propose that PEP 340 *not* allow the use of "normal"
> iterators.  Instead, the __next__ and __exit__ methods would be an
> unrelated protocol.  This would eliminate the need for a 'next()'
> builtin, 
> and avoid any confusion between today's iterators and a template
> function 
> for use with blocks.

PEP 340 does not address "normal" iterators very well, but a
properly-constructed iterator will behave correctly.

The PEP though is very generator-focussed. The issues I see for "normal"
iterators (and that need to be addressed/stated in the PEP) are:

    1. No automatic handling of parameters passed to __next__ and
       __exit__.  In a generator, these will raise at the
       yield-statement or -expression.  A "normal" iterator will have
       to take care of this manually.

This could be an argument to only allow generator-iterators to be used
with PEP 340 semantics (i.e. continue <EXPR>, block), but I don't think
it's a very compelling one.

Although perhaps the initial implementation could be restricted to
generator-iterators. So if a for-loop used `continue <EXPR>` it would
have a check (at the start of the for loop) that the iterator is a
generator-iterator. Likewise, a block-statement would always include
this check.

As another option, it might be worthwhile creating a base iterator type
with "correct" semantics.

Tim Delaney

From gvanrossum at gmail.com  Tue May  3 04:33:08 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 2 May 2005 19:33:08 -0700
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE721275@au3010avexu1.global.avaya.com>
References: <338366A6D2E2CA4C9DAEAE652E12A1DE721275@au3010avexu1.global.avaya.com>
Message-ID: <ca471dc2050502193376781121@mail.gmail.com>

[Delaney, Timothy]
> PEP 340 does not address "normal" iterators very well, but a
> properly-constructed iterator will behave correctly.

This is by design.

> The PEP though is very generator-focussed.

Disagree. The PEP describes most everything (e.g. the block statement
semantics) in terms of iterators, and then describes how the new APIs
behave for generators.

> The issues I see for "normal"
> iterators (and that need to be addressed/stated in the PEP) are:
> 
>     1. No automatic handling of parameters passed to __next__ and __exit__.
>        In a generator, these will raise at the yield-statement or -expression.
>        A "normal" iterator will have to take care of this manually.

Not sure what you mean by this. If __next__() is defined, it is passed
the parameter; if only next() is defined, a parameter (except None) is
an error. That seems exactly right. Also, if __exit__() isn't defined,
the exception is raised, which is a very sensible default behavior
(and also what will happen to a generator that doesn't catch the
exception).

> This could be an argument to only allow generator-iterators to be used
> with PEP 340 semantics (i.e. continue <EXPR>, block), but I don't think
> it's a very compelling one.

Neither do I. :-)

> Although perhaps the initial implementation could be restricted to
> generator-iterators. So if a for-loop used `continue <EXPR>` it would
> have a check (at the start of the for loop) that the iterator is a
> generator-iterator. Likewise, a block-statement would always include
> this check.

But what would this buy you except an arbitrary restriction?

> As another option, it might be worthwhile creating a base iterator type
> with "correct" semantics.

Well, what would the "correct" semantics be? What would passing a
parameter to a list iterator's __next__() method mean?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tdelaney at avaya.com  Tue May  3 04:53:03 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Tue, 3 May 2005 12:53:03 +1000
Subject: [Python-Dev] PEP 340 -- loose ends
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721276@au3010avexu1.global.avaya.com>

Guido van Rossum wrote:

> [Delaney, Timothy]
>> PEP 340 does not address "normal" iterators very well, but a
>> properly-constructed iterator will behave correctly.
> 
> This is by design.

Yep - I agree.

>> The PEP though is very generator-focussed.
> 
> Disagree. The PEP describes most everything (e.g. the block statement
> semantics) in terms of iterators, and then describes how the new APIs
> behave for generators.

Again, agree. What I meant is that there are no examples of how to
actually implement the correct semantics for a normal iterator. Doing it
right is non-trivial, especially with the __next__ and __exit__
interaction (see below).

>> The issues I see for "normal"
>> iterators (and that need to be addressed/stated in the PEP) are:
>> 
>>     1. No automatic handling of parameters passed to __next__ and
>>        __exit__. In a generator, these will raise at the
>>        yield-statement or -expression. A "normal" iterator will have
>>        to take care of this manually. 
> 
> Not sure what you mean by this. If __next__() is defined, it is passed
> the parameter; if only next() is defined, a parameter (except None) is
> an error. That seems exactly right. Also, if __exit__() isn't defined,
> the exception is raised, which is a very sensible default behavior
> (and also what will happen to a generator that doesn't catch the
> exception).

What I meant is how the iterator is meant to handle the parameters
passed to each method. PEP 340 deals with this by stating that
exceptions will be raised at the next yield-statement or -expression. I
think we need an example though of how this would translate to a
"normal" iterator. Something along the lines of::

    class iterator (object):

        def next (self):
            return self.__next__()

        def __next__(self, arg=None):
            value = None

            if isinstance(arg, ContinueIteration):
                value = arg.value
            elif arg is not None:
                raise arg

            if value is None:
                raise StopIteration

            return value

        def __exit__(self, type=None, value=None, traceback=None):
            if (type is None) and (value is None) and (traceback is None):
                type, value, traceback = sys.exc_info()

            if type is not None:
                try:
                    raise type, value, traceback
                except type, exc:
                    return self.__next__(exc)

            return self.__next__()

>> As another option, it might be worthwhile creating a base iterator
>> type with "correct" semantics.

> Well, what would the "correct" semantics be? What would passing a
> parameter to a list iterator's __next__() method mean?

Sorry - I meant for user-defined iterators. And the correct semantics
would be something like the example above I think. Except that I think
most of it would need to be in a separate method (e.g. _next) for base
classes to call - then things would change to be something like::

    class iterator (object):
        ...

        def _next (self, arg):
            if isinstance(arg, ContinueIteration):
                return arg.value
            elif arg is not None:
                raise arg

        def __next__(self, arg=None):
            value = self._next(arg)

            if value is None:
                raise StopIteration

            return value            

        ...

Finally, I think there is another loose end that hasn't been addressed::

    When __next__() is called with an argument that is not None, the
    yield-expression that it resumes will return the value attribute
    of the argument.  If it resumes a yield-statement, the value is
    ignored (or should this be considered an error?).  When the
    *initial* call to __next__() receives an argument that is not
    None, the generator's execution is started normally; the
    argument's value attribute is ignored (or should this be
    considered an error?).  When __next__() is called without an
    argument or with None as argument, and a yield-expression is
    resumed, the yield-expression returns None.

My opinion is that each of these should be an error.

Tim Delaney

From gvanrossum at gmail.com  Tue May  3 06:05:38 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 2 May 2005 21:05:38 -0700
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE721276@au3010avexu1.global.avaya.com>
References: <338366A6D2E2CA4C9DAEAE652E12A1DE721276@au3010avexu1.global.avaya.com>
Message-ID: <ca471dc20505022105212376cf@mail.gmail.com>

[Delaney, Timothy]
> What I meant is that there are no examples of how to
> actually implement the correct semantics for a normal iterator. Doing it
> right is non-trivial, especially with the __next__ and __exit__
> interaction (see below).

Depends on what you mean by right. Ignoring the argument to __next__()
and not implementing __exit__() seems totally "right" to me.

[...]
> What I meant is how the iterator is meant to handle the parameters
> passed to each method. PEP 340 deals with this by stating that
> exceptions will be raised at the next yield-statement or -expression. I
> think we need an example though of how this would translate to a
> "normal" iterator. Something along the lines of::
> 
>     class iterator (object):
> 
>         def next (self):
>             return self.__next__()
> 
>         def __next__(self, arg=None):
>             value = None
> 
>             if isinstance(arg, ContinueIteration):

Oops. Read the most recent version of the PEP again. __next__()
doesn't take an exception argument, it only takes a value. Maybe this
removes your concern?

>                 value = arg.value
>             elif arg is not None:
>                 raise arg
> 
>             if value is None:
>                 raise StopIteration
> 
>             return value

That's a very strange iterator; it immediately terminates unless you
call __next__() with a non-None argument, then it returns the argument
value. I'm having a hard time understanding what you meant to say.
Also note that the very *first* call to __next__() is not supposed to
have an argument. The argument (normally) only comes from "continue
EXPR" and that can only be reached after the first call to __next__().
This is exactly right for generators -- the first __next__() call
there "starts" the generator at the top of its function body,
executing until the first yield is reached.

>         def __exit__(self, type=None, value=None, traceback=None):
>             if (type is None) and (value is None) and (traceback is None):
>                 type, value, traceback = sys.exc_info()

You shouldn't need to check for traceback is None.

Also, even though the PEP suggests that you can do this, I don't see a
use case for it -- the translation of a block-statement never calls
__exit__() without an exception.

>             if type is not None:
>                 try:
>                     raise type, value, traceback
>                 except type, exc:
>                     return self.__next__(exc)
> 
>            return self.__next__()

Ah, here we see the other misconception (caused by not reading the
most recent version of the PEP). __exit__() shouldn't call __next__()
-- it should just raise the exception passed in unless it has
something special to do.

Let me clarify all this with an example showing how you could write
"synchronized()" as an iterator instead of a generator.

class synchronized:
    def __init__(self, lock):
        self.lock = lock
        self.state = 0
    def __next__(self, arg=None):
        # ignores arg
        if self.state:
            assert self.state == 1
            self.lock.release()
            self.state += 1
            raise StopIteration
        else:
            self.lock.acquire()
            self.state += 1
            return None
    def __exit__(self, type, value=None, traceback=None):
        assert self.state in (0, 1, 2)
        if self.state == 1:
            self.lock.release()
        raise type, value, traceback
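
[Editorial footnote: Guido's two-state protocol above is, in hindsight,
essentially the shape that later shipped as the context manager protocol
of PEP 343.  For comparison only, a rough modern equivalent -- not the
PEP 340 protocol itself:]

```python
import threading

class Synchronized:
    """Modern analogue of the two-state iterator: acquire on entry,
    release on exit (even when the body raises)."""
    def __init__(self, lock):
        self.lock = lock

    def __enter__(self):
        self.lock.acquire()
        return self.lock

    def __exit__(self, exc_type, exc_value, traceback):
        self.lock.release()
        return False  # do not swallow exceptions

lock = threading.Lock()
with Synchronized(lock):
    print(lock.locked())   # True: held inside the block
print(lock.locked())       # False: released on exit
```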

> >> As another option, it might be worthwhile creating a base iterator type
> >> with "correct" semantics.
> 
> > Well, what would the "correct" semantics be? What would passing a
> > parameter to a list iterator's __next__() method mean?
> 
> Sorry - I meant for user-defined iterators. And the correct semantics
> would be something like the example above I think. Except that I think
> most of it would need to be in a separate method (e.g. _next) for base
> classes to call - then things would change to be something like::
> 
>     class iterator (object):
>         ...
> 
>         def _next (self, arg):
>             if isinstance(arg, ContinueIteration):
>                 return arg.value
>             elif arg is not None:
>                 raise arg
> 
>         def __next__(self, arg=None):
>             value = self._next(arg)
> 
>             if value is None:
>                 raise StopIteration
> 
>             return value
> 
>         ...

I think this is all based on a misunderstanding of the PEP.

Also, you really don't need to implement __exit__() unless you have
some cleanup to do -- the default behavior of the block translation
only calls it if defined, and otherwise simply raises the exception.

> Finally, I think there is another loose end that hasn't been addressed::
> 
>     When __next__() is called with an argument that is not None, the
>     yield-expression that it resumes will return the value attribute
>     of the argument.  If it resumes a yield-statement, the value is
>     ignored (or should this be considered an error?).  When the
>     *initial* call to __next__() receives an argument that is not
>     None, the generator's execution is started normally; the
>     argument's value attribute is ignored (or should this be
>     considered an error?).  When __next__() is called without an
>     argument or with None as argument, and a yield-expression is
>     resumed, the yield-expression returns None.

Good catch.

> My opinion is that each of these should be an error.

Personally, I think not using the value passed into __next__() should
not be an error; that's about the same as not using the value returned
by a function you call. There are all sorts of reasons for doing that.
In a very early version of Python, the result of an expression that
wasn't used would be printed unless it was None (it still does this at
the interactive prompt); this was universally hated.

I agree that calling the initial __next__() of a generator with a
non-None argument should be considered an error; this is likely caused
by some kind of logic error; it can never happen when the generator is
called by a block statement.

I'll update the PEP to reflect this.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From kbk at shore.net  Tue May  3 06:25:58 2005
From: kbk at shore.net (Kurt B. Kaiser)
Date: Tue, 3 May 2005 00:25:58 -0400 (EDT)
Subject: [Python-Dev] Weekly Python Patch/Bug Summary
Message-ID: <200505030426.j434PwB1027617@bayview.thirdcreek.com>

Patch / Bug Summary
___________________

Patches :  322 open ( +6) /  2832 closed ( +1) /  3154 total ( +7)
Bugs    :  920 open (+12) /  4952 closed (+11) /  5872 total (+23)
RFE     :  186 open ( +8) /   156 closed ( +3) /   342 total (+11)

New / Reopened Patches
______________________

Info Associated with Merge to AST  (2005-01-07)
       http://python.org/sf/1097671  reopened by  kbk

Minimal cleanup of run.py  (2005-04-26)
       http://python.org/sf/1190163  opened by  Michiel de Hoon

socketmodule.c's recvfrom on OSF/1 4.0  (2005-04-27)
       http://python.org/sf/1191065  opened by  Marcel Martin

Simplify logic in random.py  (2005-04-28)
CLOSED http://python.org/sf/1191489  opened by  Raymond Hettinger

wrong offsets in bpprint()  (2005-04-28)
       http://python.org/sf/1191700  opened by  Pechkinzzz

about shift key  (2005-04-28)
       http://python.org/sf/1191726  opened by  yang

debugger ``condition`` and ``ignore`` exception handling  (2005-04-29)
       http://python.org/sf/1192590  opened by  Jeremy Jones

Use GCC4 ELF symbol visibility  (2005-04-30)
       http://python.org/sf/1192789  opened by  James Henstridge

Patches Closed
______________

type conversion methods and subclasses  (2005-01-25)
       http://python.org/sf/1109424  closed by  bcannon

Don't assume all exceptions are SyntaxError's  (2005-04-24)
       http://python.org/sf/1189210  closed by  bcannon

Automatically build fpectl module from setup.py  (2005-04-19)
       http://python.org/sf/1185529  closed by  mwh

Simplify logic in random.py  (2005-04-28)
       http://python.org/sf/1191489  closed by  rhettinger

New / Reopened Bugs
___________________

[AST] assignment to None allowed  (2005-04-25)
CLOSED http://python.org/sf/1190010  opened by  Brett Cannon

[AST] distinct code objects not created  (2005-04-25)
       http://python.org/sf/1190011  opened by  Brett Cannon

``from sys import stdin,`` doesn't raise a SyntaxError  (2005-04-25)
       http://python.org/sf/1190012  opened by  Brett Cannon

3.29 site is confusing re site-packages on Windows  (2005-04-26)
       http://python.org/sf/1190204  opened by  Kent Johnson

6.9 First sentence is confusing  (2005-04-26)
CLOSED http://python.org/sf/1190451  opened by  Nicolas Grilly

os.waitpid docs don't specify return value for WNOHANG  (2005-04-26)
       http://python.org/sf/1190563  opened by  jls

SimpleHTTPServer sends wrong c-length and locks up client  (2005-04-26)
       http://python.org/sf/1190580  opened by  Alexander Schremmer

calendar._firstweekday is too hard-wired  (2005-04-26)
       http://python.org/sf/1190596  opened by  Tres Seaver

dir() docs show incorrect output  (2005-04-26)
CLOSED http://python.org/sf/1190599  opened by  Martin Chase

bz2 RuntimeError when decompressing file  (2005-04-27)
       http://python.org/sf/1191043  opened by  Chris AtLee

Warning ``error`` filter action is ignored.  (2005-04-27)
CLOSED http://python.org/sf/1191104  opened by  Ivan Vilata i Balaguer

[AST] Failing tests  (2005-04-27)
       http://python.org/sf/1191458  opened by  Brett Cannon

'clear -1' in pdb  (2005-04-29)
       http://python.org/sf/1192315  opened by  Pechkinzzz

doctest's ELLIPSIS and multiline statements  (2005-04-29)
CLOSED http://python.org/sf/1192554  opened by  Sébastien Boisgérault

docstring error  (2005-04-29)
CLOSED http://python.org/sf/1192777  opened by  Christopher Smith

Notation  (2005-04-30)
       http://python.org/sf/1193001  opened by  Mythril

Python and Turkish Locale  (2005-04-30)
       http://python.org/sf/1193061  opened by  S.Çağlar Onur

Embedded python thread crashes  (2005-04-30)
       http://python.org/sf/1193099  opened by  ugodiggi

Strange os.path.exists() results with invalid chars  (2005-04-30)
       http://python.org/sf/1193180  opened by  Daniele Varrazzo

MACOSX_DEPLOYMENT_TARGET checked incorrectly  (2005-04-30)
       http://python.org/sf/1193190  opened by  Bob Ippolito

os.path.expanduser documentation wrt. empty $HOME  (2005-05-02)
       http://python.org/sf/1193849  opened by  Wummel

calendar.weekheader not found in __all__  (2005-05-03)
       http://python.org/sf/1193890  opened by  George Yoshida

Weakref types documentation bugs  (2005-05-02)
       http://python.org/sf/1193966  opened by  Barry A. Warsaw

bz2.BZ2File doesn't handle modes correctly  (2005-05-02)
       http://python.org/sf/1194181  opened by  Bob Ippolito

Error in section 4.2 of Python Tutorial  (2005-05-03)
       http://python.org/sf/1194209  opened by  Andrina Kelly

Bugs Closed
___________

[ast branch] fatal error when compiling test_bool.py  (2005-03-19)
       http://python.org/sf/1166714  closed by  bcannon

[AST] assert failure on ``eval("u'\Ufffffffe'")``  (2005-04-19)
       http://python.org/sf/1186345  closed by  bcannon

"Atuple containing default argument values ..."  (2005-04-25)
       http://python.org/sf/1189819  closed by  rhettinger

[AST] assignment to None allowed  (2005-04-25)
       http://python.org/sf/1190010  closed by  bcannon

6.9 First sentence is confusing  (2005-04-26)
       http://python.org/sf/1190451  closed by  rhettinger

Python 2.4 Not Recognized by Any Programs  (2005-04-23)
       http://python.org/sf/1188637  closed by  tjreedy

Variable.__init__ uses self.set(), blocking specialization  (2005-04-07)
       http://python.org/sf/1178872  closed by  tjreedy

Compiler generates relative filenames  (2001-04-11)
       http://python.org/sf/415492  closed by  tjreedy

smtplib crashes Windows Kernal.  (2003-06-09)
       http://python.org/sf/751612  closed by  tjreedy

AssertionError from urllib.retrieve / httplib  (2003-06-15)
       http://python.org/sf/755080  closed by  tjreedy

dir() docs show incorrect output  (2005-04-26)
       http://python.org/sf/1190599  closed by  mwh

Warning ``error`` filter action is ignored.  (2005-04-27)
       http://python.org/sf/1191104  closed by  vsajip

doctest's ELLIPSIS and multiline statements  (2005-04-29)
       http://python.org/sf/1192554  closed by  boisgerault

docstring error  (2005-04-29)
       http://python.org/sf/1192777  closed by  bcannon

New / Reopened RFE
__________________

The array module and the buffer interface  (2005-04-25)
       http://python.org/sf/1190033  opened by  Josiah Carlson

logging module '.' behavior  (2005-04-26)
       http://python.org/sf/1190689  opened by  Christopher Dunn

Add 'before' and 'after' methods to Strings  (2005-04-26)
CLOSED http://python.org/sf/1190701  opened by  Christopher Dunn

cStringIO has reset(), but StringIO does not  (2005-04-27)
CLOSED http://python.org/sf/1191420  opened by  Christopher Dunn

logging module root logger name  (2005-04-26)
       http://python.org/sf/1190689  reopened by  cxdunn

slice indices different than integers  (2005-04-28)
CLOSED http://python.org/sf/1191697  opened by  Sebastien de Menten

make slices pickable  (2005-04-28)
       http://python.org/sf/1191699  opened by  Sebastien de Menten

asynchronous Subprocess  (2005-04-28)
       http://python.org/sf/1191964  opened by  Josiah Carlson

"replace" function should accept lists.  (2005-04-17)
       http://python.org/sf/1184678  reopened by  poromenos

'str'.translate(None) => identity translation  (2005-04-30)
       http://python.org/sf/1193128  opened by  Bengt Richter

add server.shutdown() method and daemon arg to SocketServer  (2005-05-02)
       http://python.org/sf/1193577  opened by  paul rubin

Expat Parser to supply document locator in incremental parse  (2005-05-02)
       http://python.org/sf/1193610  opened by  GaryD

RFE Closed
__________

Add 'before' and 'after' methods to Strings  (2005-04-26)
       http://python.org/sf/1190701  closed by  rhettinger

cStringIO has reset(), but StringIO does not  (2005-04-27)
       http://python.org/sf/1191420  closed by  rhettinger

logging module root logger name  (2005-04-27)
       http://python.org/sf/1190689  closed by  vsajip

logging module documentation  (2003-01-16)
       http://python.org/sf/668905  closed by  vsajip

slice indices different than integers  (2005-04-28)
       http://python.org/sf/1191697  closed by  mwh


From tdelaney at avaya.com  Tue May  3 06:59:42 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Tue, 3 May 2005 14:59:42 +1000
Subject: [Python-Dev] PEP 340 -- loose ends
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721277@au3010avexu1.global.avaya.com>

Guido van Rossum wrote:

> Oops. Read the most recent version of the PEP again. __next__()
> doesn't take an exception argument, it only takes a value. Maybe this
> removes your concern?

Actually, I misinterpreted it, assuming that the value passed in was an
exception instance because the previous versions worked that way. This
has been going on too long ;)

> Ah, here we see the other misconception (caused by not reading the
> most recent version of the PEP). __exit__() shouldn't call __next__()
> -- it should just raise the exception passed in unless it has
> something special to do.

Ah - I think this needs to be explained better. In particular, in the
specification of the __next__ and __exit__ methods it should state what
exceptions are expected to be raised under what circumstances - in
particular, that __exit__ is expected to raise the passed in exception
or StopIteration. This is only explained in the Generator Exception
Handling specification, but it's applicable to all iterators.

>> Finally, I think there is another loose end that hasn't been
>> addressed:: 
>> 
>>     When __next__() is called with an argument that is not None, the
>>     yield-expression that it resumes will return the value attribute
>>     of the argument.  If it resumes a yield-statement, the value is
>>     ignored (or should this be considered an error?).  When the
>>     *initial* call to __next__() receives an argument that is not
>>     None, the generator's execution is started normally; the
>>     argument's value attribute is ignored (or should this be
>>     considered an error?).  When __next__() is called without an
>>     argument or with None as argument, and a yield-expression is
>>     resumed, the yield-expression returns None.
> 
> Good catch.
> 
>> My opinion is that each of these should be an error.
> 
> Personally, I think not using the value passed into __next__() should
> not be an error; that's about the same as not using the value returned
> by a function you call.

Now that I understand that the parameter to __next__ is not an
exception, I agree.

> I agree that calling the initial __next__() of a generator with a
> non-None argument should be considered an error; this is likely caused
> by some kind of logic error; it can never happen when the generator is
> called by a block statement.

Cheers.

Tim Delaney

From ncoghlan at gmail.com  Tue May  3 11:28:40 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 03 May 2005 19:28:40 +1000
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com>
References: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com>
Message-ID: <42774448.8080306@gmail.com>

Phillip J. Eby wrote:
> Specifically, I propose that PEP 340 *not* allow the use of "normal" 
> iterators.  Instead, the __next__ and __exit__ methods would be an 
> unrelated protocol.  This would eliminate the need for a 'next()' builtin, 
> and avoid any confusion between today's iterators and a template function 
> for use with blocks.

I would extend this to say that invoking the blocktemplate decorator should 
eliminate the conventional iteration interface, preventing the following 
problematically silent bug:

   for l in synchronized(mylock):
     # This lock is not released promptly!
     break
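
The failure mode described here can be made loud simply by having the
decorator return an object that implements __next__ and __exit__ but
deliberately omits __iter__: a for loop then fails immediately instead of
silently leaking the lock.  A sketch in today's Python (all names
hypothetical):

```python
import threading

class _BlockOnly:
    """Implements the block protocol but deliberately not __iter__,
    so iter() -- and hence any for loop -- rejects it at once."""
    def __init__(self, gen):
        self._gen = gen
    def __next__(self):
        return next(self._gen)
    def __exit__(self, type, value=None, traceback=None):
        raise value

def blocktemplate(genfunc):
    def wrapper(*args, **kwds):
        return _BlockOnly(genfunc(*args, **kwds))
    return wrapper

@blocktemplate
def synchronized(lock):
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

mylock = threading.Lock()
try:
    for l in synchronized(mylock):   # fails fast: object is not iterable
        break
except TypeError as exc:
    print(exc)
```

Since the generator body never starts, the lock is never acquired, so
nothing is leaked.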

> My argument is that this is both Explicit (i.e., better than implicit) and 
> One Obvious Way (because using existing iterators is just Another Way to do a 
> "for" loop).  It also doesn't allow Errors (using an iterator with no 
> special semantics) to Pass Silently.

While I agree these are advantages, a bigger issue for me would be the one 
above: keeping a block template which expects prompt finalisation from being 
inadvertently used in a conventional for loop which won't finalise on early 
termination of the loop.

I'd also suggest that the blocktemplate decorator accept any iterator, not just 
generators.

> Of course, since Practicality Beats Purity, I could give this all up.  But 
> I don't think the Implementation is Hard to Explain, as it should be just 
> as easy as Guido's proposal.

I think it would be marginally easier to explain, since the confusion between 
iterators and block templates would be less of a distraction.

>  Really, the only thing that changes is that you get a 
> TypeError when a template function returns an iterator instead of a block 
> template, and you have to use the decorator on your generators to 
> explicitly label them safe for use with blocks.

I'd add raising a TypeError when a block template is passed to the iter() 
builtin to the list of differences from the current incarnation of the PEP.

As for Phillip, I think using different APIs is a good way to more clearly 
emphasise the difference in purpose between conventional for loops and the new 
block statement, but I'm also a little concerned about incorrectly passing a 
block template to a for loop.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From ncoghlan at gmail.com  Tue May  3 11:33:35 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 03 May 2005 19:33:35 +1000
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <ca471dc20505021755518773c8@mail.gmail.com>
References: <ca471dc20505021755518773c8@mail.gmail.com>
Message-ID: <4277456F.7090402@gmail.com>

Guido van Rossum wrote:
> 1. I still can't decide on keyword vs. no keyword, but if we're going
> to have a keyword, I haven't seen a better proposal than block. So
> it's either block or nothing. I'll sleep on this. Feel free to start
> an all-out flame war on this in c.l.py. ;-)

I quite like 'block', but can live with no keyword (since it then becomes a 
practical equivalent to user-defined statements).

> 2. No else clause; the use case is really weak and there are too many
> possible semantics. It's not clear whether to generalize from
> for/else, or if/else, or what else.

Agreed. The order I posted my list of semantic options was the order I thought 
of them, but I ended up agreeing with the votes Aahz posted.

> 3. I'm leaning against Phillip's proposal; IMO it adds more complexity
> for very little benefit.

See my response to Phillip. I think there could be an advantage to it if it 
means that "for l in synchronized(lock)" raises an immediate error instead of 
silently doing the wrong thing.
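[Editorial note: the separate-API approach is roughly how things later landed in PEP 343; a context manager built with today's contextlib is deliberately not iterable, so the mistake fails loudly. A minimal sketch (the synchronized helper here is illustrative, not a stdlib function):]

```python
import threading
from contextlib import contextmanager

@contextmanager
def synchronized(lock):
    # Acquire on entry; release on exit, even on exception.
    lock.acquire()
    try:
        yield lock
    finally:
        lock.release()

lock = threading.Lock()

with synchronized(lock):
    pass  # lock is held here

# Misusing the template in a for loop fails immediately:
try:
    for _ in synchronized(lock):
        pass
except TypeError:
    print("context manager is not iterable")
```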

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From pierre.barbier at cirad.fr  Tue May  3 13:32:18 2005
From: pierre.barbier at cirad.fr (Pierre Barbier de Reuille)
Date: Tue, 03 May 2005 13:32:18 +0200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <4277456F.7090402@gmail.com>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<4277456F.7090402@gmail.com>
Message-ID: <42776142.9010006@cirad.fr>

Nick Coghlan wrote:

>>3. I'm leaning against Phillip's proposal; IMO it adds more complexity
>>for very little benefit.
> 
> 
> See my response to Phillip. I think there could be an advantage to it if it 
> means that "for l in synchronized(lock)" raises an immediate error instead of 
> silently doing the wrong thing.

First, I really think this PEP is needed for Python. But this expresses 
exactly my main concern about it! As far as I understand it, 
iterator-for-blocks and iterator-for-loops are two different beasts. 
Even if iterator-for-loops can be used within a block without damage, 
the use of an iterator-for-block in a loop can lead to completely 
unpredictable results (and results that are really hard to track down, 
since they'll possibly involve race conditions or deadlocks).

To try to be as clear as possible, I would say the iterator-for-loops 
are simplified iterator-for-blocks. IOW, if I were to put them in a 
class inheritance hierarchy (I don't say we should put them into one ;) 
), iterator-for-block would be the base class of iterator-for-loop. Thus, 
as for-loops require an iterator-for-loop, they would raise an error if 
used with an iterator-for-block. But as blocks require an 
iterator-for-block, they will allow iterator-for-loops too!
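[Editorial note: Pierre's hierarchy can be sketched with two illustrative classes. All names below are hypothetical, chosen only to show the proposed type relationship, not any real API:]

```python
class BlockIterator:
    """Base class: usable with the proposed block statement."""
    def __iter__(self):
        return self
    def __next__(self):
        raise StopIteration

class LoopIterator(BlockIterator):
    """Subclass: additionally safe for ordinary for loops."""

def run_for_loop(it):
    # Under Pierre's scheme, a for loop would reject plain block iterators
    # while blocks would accept both kinds.
    if isinstance(it, BlockIterator) and not isinstance(it, LoopIterator):
        raise TypeError("iterator-for-block used in a for loop")
    for _ in it:
        pass

run_for_loop(LoopIterator())        # accepted: it is a loop iterator
try:
    run_for_loop(BlockIterator())   # rejected: block iterators are loop-unsafe
except TypeError:
    print("rejected")
```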

Cheers,

Pierre

-- 
Pierre Barbier de Reuille

INRA - UMR Cirad/Inra/Cnrs/Univ.MontpellierII AMAP
Botanique et Bio-informatique de l'Architecture des Plantes
TA40/PSII, Boulevard de la Lironde
34398 MONTPELLIER CEDEX 5, France

tel   : (33) 4 67 61 65 77    fax   : (33) 4 67 61 56 68

From m.u.k.2 at gawab.com  Tue May  3 13:42:11 2005
From: m.u.k.2 at gawab.com (m.u.k)
Date: Tue, 3 May 2005 11:42:11 +0000 (UTC)
Subject: [Python-Dev] Need to hook Py_FatalError
Message-ID: <Xns964B980C943F1token@80.91.229.5>

Greetings,

Currently Py_FatalError only dumps the error to stderr and calls abort().
When doing quirky things with the interpreter, it's annoying that the process 
just terminates. Is there any reason why we still don't have a simple 
callback to hook Py_FatalError?

PS. If the answer is "because no one needs/implemented...", I can volunteer.


Best regards.



From ncoghlan at gmail.com  Tue May  3 14:24:17 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 03 May 2005 22:24:17 +1000
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <20050503063140.A7719@familjen.svensson.org>
References: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com>
	<42774448.8080306@gmail.com>
	<20050503063140.A7719@familjen.svensson.org>
Message-ID: <42776D71.6080303@gmail.com>

Paul Svensson wrote:
> On Tue, 3 May 2005, Nick Coghlan wrote:
> 
>> I'd also suggest that the blocktemplate decorator accept any iterator, 
>> not just
>> generators.
> 
> 
> So you want decorators on classes now ?

A decorator is just a function - it doesn't *need* to be used with decorator 
syntax. I just think the following code should work for any iterator:

   block blocktemplate(itr):
     # Do stuff

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From ncoghlan at gmail.com  Tue May  3 15:07:07 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 03 May 2005 23:07:07 +1000
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <42776142.9010006@cirad.fr>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<4277456F.7090402@gmail.com>
	<42776142.9010006@cirad.fr>
Message-ID: <4277777B.3000300@gmail.com>

Pierre Barbier de Reuille wrote:
> Even if iterator-for-loops can be used within a block without damage, 
> the use of iterator-for-block in a loop can lead to completely 
> unpredictable result (and result really hard to find since they'll 
> possibly involve race conditions or dead locks).

I had a longish post written before I realised I'd completely misunderstood your 
comment. You were actually agreeing with me, so most of my post was totally 
beside the point.

Anyway, to summarise the argument in favour of separate API's for iterators and 
block templates, the first code example below is a harmless quirk (albeit an 
irritating violation of TOOWTDI). The second and third examples are potentially 
serious bugs:

   block range(10) as i:
     # Just a silly way to write "for i in range(10)"

   for f in opening(name):
     # When f gets closed is Python implementation dependent

   for lock in synchronized(mylock):
     # When lock gets released is Python implementation dependent

Cheers,
Nick.

P.S. Dear lord, synchronized is an aggravating name for that function. I keep 
wanting to spell it with a second letter 's', like any civilised person ;)

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From eric.nieuwland at xs4all.nl  Tue May  3 15:08:08 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Tue, 3 May 2005 15:08:08 +0200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <42776142.9010006@cirad.fr>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<4277456F.7090402@gmail.com> <42776142.9010006@cirad.fr>
Message-ID: <2382e9f0bf39f58c4770f1b37085f716@xs4all.nl>

I've been away for a while and just read through the PEP 340 discussion 
with growing amazement.

Pierre Barbier de Reuille wrote:
> As far as I understand it,
> iterator-for-blocks and iterator-for-loops are two different beasts.
Right!

> To try being as clear as possible, I would say the iterator-for-loops
> are simplified iterator-for-blocks. IOW, if I were to put them in a
> class inheritance hierarchy (I don't say we should put them into one ;)
> ) iterator-for-block would be the base class of iterator-for-loop. 
> Thus,
> as for-loops require an iterator-for-loop, they would raise an error if
> used with an iterator-for-block. But as blocks require an
> iterator-for-blocks they will allow iterator-for-loops too !
IMHO it is more like round holes and square pegs (or the other way 
around).

What PEP 340 seems to be trying to achieve is a generic mechanism to 
define templates with holes/place holders for blocks of code. That 
gives two nouns ('template' and 'code block') that both qualify as 
indicators of reusable items.

We can use standard functions as reusable code blocks. Wouldn't a 
template then be just a function that takes other functions as 
arguments? All information transfer between the template and its 
arguments is via the parameter list/returned values.

What am I missing?

--eric


From mwh at python.net  Tue May  3 16:35:35 2005
From: mwh at python.net (Michael Hudson)
Date: Tue, 03 May 2005 15:35:35 +0100
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <42776D71.6080303@gmail.com> (Nick Coghlan's message of "Tue,
	03 May 2005 22:24:17 +1000")
References: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com>
	<42774448.8080306@gmail.com>
	<20050503063140.A7719@familjen.svensson.org>
	<42776D71.6080303@gmail.com>
Message-ID: <2mhdhkz8aw.fsf@starship.python.net>

Nick Coghlan <ncoghlan at gmail.com> writes:

> Paul Svensson wrote:
>> On Tue, 3 May 2005, Nick Coghlan wrote:
>> 
>>> I'd also suggest that the blocktemplate decorator accept any iterator, 
>>> not just
>>> generators.
>> 
>> 
>> So you want decorators on classes now ?
>
> A decorator is just a function - it doesn't *need* to be used with decorator 
> syntax. I just think the following code should work for any iterator:
>
>    block blocktemplate(itr):
>      # Do stuff

But in 

@blocktemplate
def foo(...):
    ...

blocktemplate isn't passed an iterator, it's passed a callable that
returns an iterator.

Cheers,
mwh

-- 
    . <- the point                                your article -> .
    |------------------------- a long way ------------------------|
                                       -- Christophe Rhodes, ucam.chat

From tom-python-dev at rothamel.us  Tue May  3 17:05:10 2005
From: tom-python-dev at rothamel.us (Tom Rothamel)
Date: Tue, 3 May 2005 11:05:10 -0400
Subject: [Python-Dev] PEP 340: Breaking out.
Message-ID: <20050503150510.GA13595@onegeek.org>

I have a question/suggestion about PEP 340.

As I read the PEP right now, the code:


while True:

    block synchronized(v1):
         if v1.field:
              break

    time.sleep(1)


Will never break out of the enclosing while loop. This is because the
break breaks the while loop that the block statement is translated
into, instead of breaking the outer while loop.

Am I understanding this right, or am I misunderstanding this?

If I am understanding this right, I would suggest allowing some way of
having the iterator call continue or break in the enclosing
context. (Perhaps by enclosing the entire translation of block in a
try-except construct, which catches Stop and Continue exceptions
raised by the generator and re-raises them in the outer context.)
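[Editorial note: for comparison, the construct as eventually adopted (PEP 343's with statement) is not a loop, so break behaves exactly the way Tom expects here; a minimal sketch:]

```python
import threading

lock = threading.Lock()
count = 0
while True:
    with lock:          # not a loop, so break passes straight through it
        count += 1
        if count >= 3:
            break       # exits the enclosing while loop
print(count)
```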

I hope this helps.

-- 
Tom Rothamel ----------------------------------- http://www.rothamel.us/~tom/

From pierre.barbier at cirad.fr  Tue May  3 17:25:09 2005
From: pierre.barbier at cirad.fr (Pierre Barbier de Reuille)
Date: Tue, 03 May 2005 17:25:09 +0200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <20050503150510.GA13595@onegeek.org>
References: <20050503150510.GA13595@onegeek.org>
Message-ID: <427797D5.8030207@cirad.fr>


Tom Rothamel wrote:
> I have a question/suggestion about PEP 340.
> 
> As I read the PEP right now, the code:
> 
> 
> while True:
> 
>     block synchronized(v1):
>          if v1.field:
>               break
> 
>     time.sleep(1)
> 
> 
> Will never break out of the enclosing while loop. This is because the
> break breaks the while loop that the block statement is translated
> into, instead of breaking the outer True statement. 

Well, that's exactly what it is intended to do, and what I would expect 
it to do! break/continue affect only the innermost loop.

> 
> Am I understanding this right, or am I misunderstanding this?
> 
> If I am understanding this right, I would suggest allowing some way of
> having the iterator call continue or break in the enclosing
> context. (Perhaps by enclosing the entire translation of block in a
> try-except construct, which catches Stop and Continue exceptions
> raised by the generator and re-raises them in the outer context.)
> 
> I hope this helps.
> 

I don't want it like that! This would differ from the break/continue 
used in other loops. If you need to break out of many loops, enclose them 
in a function and return from it!

Pierre


-- 
Pierre Barbier de Reuille

INRA - UMR Cirad/Inra/Cnrs/Univ.MontpellierII AMAP
Botanique et Bio-informatique de l'Architecture des Plantes
TA40/PSII, Boulevard de la Lironde
34398 MONTPELLIER CEDEX 5, France

tel   : (33) 4 67 61 65 77    fax   : (33) 4 67 61 56 68

From skip at pobox.com  Tue May  3 17:30:53 2005
From: skip at pobox.com (Skip Montanaro)
Date: Tue, 3 May 2005 10:30:53 -0500
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <427797D5.8030207@cirad.fr>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
Message-ID: <17015.39213.522060.873605@montanaro.dyndns.org>

>>>>> "Pierre" == Pierre Barbier de Reuille <pierre.barbier at cirad.fr> writes:

    Pierre> Tom Rothamel wrote:
    >> I have a question/suggestion about PEP 340.
    >> 
    >> As I read the PEP right now, the code:
    >> 
    >> while True:
    >>     block synchronized(v1):
    >>	       if v1.field:
    >>	           break
    >>     time.sleep(1)
    >> 
    >> Will never break out of the enclosing while loop.

    Pierre> Well, that's exactly what it is intended to do and what I would
    Pierre> expect it to do ! break/continue affect only the inner-most
    Pierre> loop.

Yeah, but "block synchronized(v1)" doesn't look like a loop.  I think this
might be a common stumbling block for people using this construct.

Skip

From python at rcn.com  Tue May  3 17:41:52 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 3 May 2005 11:41:52 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <2mhdhkz8aw.fsf@starship.python.net>
Message-ID: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>

I just made a first reading of the PEP and want to clarify my
understanding of how it fits with existing concepts.

Is it correct to say that "continue" parallels its current meaning and
returns control upwards (?outwards) to the block iterator that called
it?

Likewise, is it correct that "yield" is anti-parallel to the current
meaning?  Inside a generator, it returns control upwards to the caller.
But inside a block-iterator, it pushes control downwards (?inwards) to
the block it controls.

Is the distinction between block iterators and generators similar to the
Gang-of-Four's distinction between external and internal iterators?

Are there some good use cases that do not involve resource locking?
IIRC, that same use case was listed as a prime motivating example for
decorators (i.e. @synchronized).  TOOWTDI suggests that a single use case
should not be used to justify multiple, orthogonal control structures.

It would be great if we could point to some code in the standard library
or in a major Python application that would be better (cleaner, faster,
or clearer) if re-written using blocks and block-iterators.  I've
scanned through the code base looking for some places to apply the idea
and have come up empty handed.  This could mean that I've not yet
grasped the essence of what makes the idea useful or it may have other
implications such as apps needing to be designed from the ground-up with
block iterators in mind.


Raymond

From pierre.barbier at cirad.fr  Tue May  3 17:43:36 2005
From: pierre.barbier at cirad.fr (Pierre Barbier de Reuille)
Date: Tue, 03 May 2005 17:43:36 +0200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <17015.39213.522060.873605@montanaro.dyndns.org>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
Message-ID: <42779C28.5000207@cirad.fr>



Skip Montanaro wrote:
> [...]
> 
> Yeah, but "block synchronized(v1)" doesn't look like a loop.  I think this
> might be a common stumbling block for people using this construct.
> 
> Skip
> 

Well, this can be a problem, because the block statement does indeed 
introduce a new loop construct in Python. That's why I advocated some 
time ago against the introduction of a new name. IMHO, the for-loop 
syntax could really be used instead of blocks, as its behavior is exactly 
that of a for-loop when the iterator is an iterator-for-loop, and the 
current for-loop cannot be used with iterator-for-blocks.

The main problem with this syntax is the use of blocks for things 
that are not loops (like the synchronized object)! And they are, indeed, 
quite common! (or they will be :) )

Pierre

-- 
Pierre Barbier de Reuille

INRA - UMR Cirad/Inra/Cnrs/Univ.MontpellierII AMAP
Botanique et Bio-informatique de l'Architecture des Plantes
TA40/PSII, Boulevard de la Lironde
34398 MONTPELLIER CEDEX 5, France

tel   : (33) 4 67 61 65 77    fax   : (33) 4 67 61 56 68

From ldlandis at gmail.com  Tue May  3 17:55:12 2005
From: ldlandis at gmail.com (LD "Gus" Landis)
Date: Tue, 3 May 2005 10:55:12 -0500
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>
References: <2mhdhkz8aw.fsf@starship.python.net>
	<000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>
Message-ID: <a1ddf57e05050308557088bad9@mail.gmail.com>

Hi,

  Sounds like a useful requirement to have for new features in 2.x,
  IMO.  That is... "demonstrated need".

  If the feature implies that the app needs to be designed from the
  ground up to *really* take advantage of the feature, then, maybe
  leave it for Guido's sabbatical (e.g. Python 3000).

On 5/3/05, Raymond Hettinger <python at rcn.com> wrote:
> It would be great if we could point to some code in the standard library
> or in a major Python application that would be better (cleaner, faster,
> or clearer) if re-written using blocks and block-iterators.  I've
> scanned through the code base looking for some places to apply the idea
> and have come up empty handed.  This could mean that I've not yet
> grasped the essence of what makes the idea useful or it may have other
> implications such as apps needing to be designed from the ground-up with
> block iterators in mind.
> 
> Raymond

-- 
LD Landis - N0YRQ - from the St Paul side of Minneapolis

From gvanrossum at gmail.com  Tue May  3 18:15:42 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 09:15:42 -0700
Subject: [Python-Dev] Need to hook Py_FatalError
In-Reply-To: <Xns964B980C943F1token@80.91.229.5>
References: <Xns964B980C943F1token@80.91.229.5>
Message-ID: <ca471dc205050309156962d3ff@mail.gmail.com>

> Currently Py_FatalError only dumps the error to stderr and calls abort().
> When doing quirky things with the interpreter, it's so annoying that process
> just terminates. Are there any reason why we still dont have a simple
> callback to hook Py_FatalError.
> 
> PS. If the answer is "because no one needs/implemented...", I can volunteer.

Your efforts would be better directed towards fixing the causes of the
fatal errors.

I see no need to hook Py_FatalError, but since it's open source, you
are of course free to patch your own copy if your urge is truly
irresistible. Or I guess you could run Python under supervision of gdb
and trap it that way.

But tell me, what do you want the process to do instead of
terminating? Py_FatalError is used in situations where raising an
exception is impossible or would do more harm than good.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From jcarlson at uci.edu  Tue May  3 18:14:28 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Tue, 03 May 2005 09:14:28 -0700
Subject: [Python-Dev] Need to hook Py_FatalError
In-Reply-To: <Xns964B980C943F1token@80.91.229.5>
References: <Xns964B980C943F1token@80.91.229.5>
Message-ID: <20050503090452.648C.JCARLSON@uci.edu>


"m.u.k" <m.u.k.2 at gawab.com> wrote:
> Currently Py_FatalError only dumps the error to stderr and calls abort().
> When doing quirky things with the interpreter, it's so annoying that process 
> just terminates. Are there any reason why we still dont have a simple 
> callback to hook Py_FatalError.
> 
> PS. If the answer is "because no one needs/implemented...", I can volunteer.

In looking at the use of Py_FatalError in the Python Sources (it's a 10
meg tarball that is well worth the download), it looks as though its use
shows a fatal error (hence the name).  Things like "Inconsistent
interned string state" or "Immortal interned string died" or "Can't
initialize type", etc.

Essentially, those errors generally signify "the internal state of
python is messed up", whether that be by C extension, or even a bug in
Python.  The crucial observation is that many of them have ambiguous
possible recoveries.  How do you come back from "Can't initialize type",
or even 'gc couldn't allocate "__del__"'?


When you have individual solutions to some subset of the uses of
Py_FatalError, then it would make sense to offer those solutions as a
replacement to Py_FatalError use in those situations (also showing that
the errors are not actually fatal), rather than to ask for a hook to
hook all (by definition) fatal errors.


 - Josiah


From gvanrossum at gmail.com  Tue May  3 18:53:18 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 09:53:18 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>
References: <2mhdhkz8aw.fsf@starship.python.net>
	<000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>
Message-ID: <ca471dc2050503095336305c8b@mail.gmail.com>

[Raymond Hettinger]
> I just made a first reading of the PEP and want to clarify my
> understanding of how it fits with existing concepts.

Thanks! Now is about the right time -- all the loose ends are being
solidified (in my mind, anyway).

> Is it correct to say that "continue" parallel's its current meaning and
> returns control upwards (?outwards) to the block iterator that called
> it?

I have a hard time using directions as metaphors (maybe because on
some hardware, stacks grow down) unless you mean "up in the source
code" which doesn't make a lot of sense either in this context.

But yes, continue does what you expect it to do in a loop.

Of course, in a resource allocation block, continue and break are
pretty much the same (just as they are in any loop that you know has
only one iteration).

> Likewise, is it correct that "yield" is anti-parallel to the current
> meaning?  Inside a generator, it returns control upwards to the caller.
> But inside a block-iterator, it pushes control downwards (?inwards) to
> the block it controls.

I have a hard time visualizing the difference. They feel the same to
me, and the implementation (from the generator's POV) is identical:
yield suspends the current frame, returning to the previous frame from
the call to next() or __next__(), and the suspended frame can be
resumed by calling next() / __next__() again.
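[Editorial note: the symmetry Guido describes is easy to see with a bare generator, where each next() resumes the suspended frame; a sketch in current syntax:]

```python
trace = []

def gen():
    trace.append("before yield")
    yield                      # suspend here, return to the caller of next()
    trace.append("after yield")

g = gen()
next(g)                        # runs the frame up to the yield, then suspends
assert trace == ["before yield"]
try:
    next(g)                    # resumes after the yield; frame then finishes
except StopIteration:
    pass
assert trace == ["before yield", "after yield"]
```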

> Is the distinction between block iterators and generators similar to the
> Gang-of-Four's distinction between external and internal iterators?

I looked it up in the book (p. 260), and I think generators have a
duality to them that makes the distinction useless, or at least
relative to your POV. With a classic for-loop driven by a generator,
the author of the for-loop thinks of it as an external iterator -- you
ask for the next item using the (implicit) call to next(). But the
author of the generator thinks of it as an internal iterator -- the
for loop resumes only when the generator feels like it.

> Are there some good use cases that do not involve resource locking?
> IIRC, that same use case was listed a prime motivating example for
> decorators (i.e. @syncronized).  TOOWTDI suggests that a single use case
> should not be used to justify multiple, orthogonal control structures.

Decorators don't need @synchronized as a motivating use case; there
are plenty of other use cases.

Anyway, @synchronized was mostly a demonstration toy; whole method
calls are rarely the right granularity of locking. (BTW in the latest
version of PEP 340 I've renamed synchronized to locking; many people
complained about the strange Javaesque term.)

Look at the examples in the PEP (version 1.16) for more use cases.

> It would be great if we could point to some code in the standard library
> or in a major Python application that would be better (cleaner, faster,
> or clearer) if re-written using blocks and block-iterators.  I've
> scanned through the code base looking for some places to apply the idea
> and have come up empty handed.  This could mean that I've not yet
> grasped the essence of what makes the idea useful or it may have other
> implications such as apps needing to be designed from the ground-up with
> block iterators in mind.

I presume you mentally discarded the resource allocation use cases
where the try/finally statement was the outermost statement in the
function body, since those would be helped by @synchronized; but look
more closely at Queue, and you'll find that the two such methods use
different locks!

Also the use case for closing a file upon leaving a block, while
clearly a resource allocation use case, doesn't work well with a
decorator.

I just came across another use case that is fairly common in the
standard library: redirecting sys.stdout. This is just a beauty (in
fact I'll add it to the PEP):

def saving_stdout(f):
    save_stdout = sys.stdout
    try:
        sys.stdout = f
        yield
    finally:
        sys.stdout = save_stdout
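[Editorial note: this pattern survives in today's standard library as contextlib.redirect_stdout; a usage sketch:]

```python
import contextlib
import io

buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    print("hello")              # written to buf, not the real stdout
print(repr(buf.getvalue()))      # the captured text, 'hello\n'
```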

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Tue May  3 19:13:53 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 10:13:53 -0700
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <17015.39213.522060.873605@montanaro.dyndns.org>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
Message-ID: <ca471dc20505031013287a2e92@mail.gmail.com>

[Skip Montanaro]
> Yeah, but "block synchronized(v1)" doesn't look like a loop.  I think this
> might be a common stumbling block for people using this construct.

How many try/finally statements have you written inside a loop? In my
experience this is extreeeemely rare. I found no occurrences in the
standard library.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Tue May  3 19:20:38 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 03 May 2005 13:20:38 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc2050503095336305c8b@mail.gmail.com>
References: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>
	<2mhdhkz8aw.fsf@starship.python.net>
	<000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>
Message-ID: <5.1.1.6.0.20050503130527.02471ad0@mail.telecommunity.com>

At 09:53 AM 5/3/05 -0700, Guido van Rossum wrote:
>I just came across another use case that is fairly common in the
>standard library: redirecting sys.stdout. This is just a beauty (in
>fact I'll add it to the PEP):
>
>def saving_stdout(f):

Very nice; may I suggest 'redirecting_stdout' as the name instead?

This and other examples from the PEP still have a certain awkwardness of 
phrasing in their names.  A lot of them seem to cry out for a "with" 
prefix, although maybe that's part of the heritage of PEP 310.  But Lisp 
has functions like 'with-open-file', so I don't think that it's *all* a PEP 
310 influence on the examples.

It also seems to me that it would be nice if locks, files, sockets and 
similar resources would implement the block-template protocol; then one 
could simply say:

      block self.__lock:
          ...

or:

      open("foo") as f:
          ...

And not need any special wrappers.  Of course, this could only work for 
files if the block-template protocol were distinct from the normal 
iteration protocol.


From aahz at pythoncraft.com  Tue May  3 19:31:32 2005
From: aahz at pythoncraft.com (Aahz)
Date: Tue, 3 May 2005 10:31:32 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <5.1.1.6.0.20050503130527.02471ad0@mail.telecommunity.com>
References: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>
	<2mhdhkz8aw.fsf@starship.python.net>
	<000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>
	<5.1.1.6.0.20050503130527.02471ad0@mail.telecommunity.com>
Message-ID: <20050503173131.GA1375@panix.com>

On Tue, May 03, 2005, Phillip J. Eby wrote:
> At 09:53 AM 5/3/05 -0700, Guido van Rossum wrote:
>>
>>I just came across another use case that is fairly common in the
>>standard library: redirecting sys.stdout. This is just a beauty (in
>>fact I'll add it to the PEP):
>>
>>def saving_stdout(f):
> 
> Very nice; may I suggest 'redirecting_stdout' as the name instead?

You may; I'd nitpick that to either "redirect_stdout" or
"redirected_stdout".  "redirecting_stdout" is slightly longer and doesn't
have quite the right flavor to my eye.  I might even go for "make_stdout"
or "using_stdout"; that relies on people understanding that a block means
temporary usage.

> This and other examples from the PEP still have a certain awkwardness
> of phrasing in their names.  A lot of them seem to cry out for a
> "with" prefix, although maybe that's part of the heritage of PEP 310.
> But Lisp has functions like 'with-open-file', so I don't think that
> it's *all* a PEP 310 influence on the examples.

Yes, that's why I've been pushing for "with".
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"It's 106 miles to Chicago.  We have a full tank of gas, a half-pack of
cigarettes, it's dark, and we're wearing sunglasses."  "Hit it."

From foom at fuhm.net  Tue May  3 19:38:27 2005
From: foom at fuhm.net (James Y Knight)
Date: Tue, 3 May 2005 13:38:27 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc2050503095336305c8b@mail.gmail.com>
References: <2mhdhkz8aw.fsf@starship.python.net>
	<000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>
	<ca471dc2050503095336305c8b@mail.gmail.com>
Message-ID: <8dffe7626eaf9a89812c2828bcf96efe@fuhm.net>

On May 3, 2005, at 12:53 PM, Guido van Rossum wrote:
> def saving_stdout(f):
>     save_stdout = sys.stdout
>     try:
>         sys.stdout = f
>         yield
>     finally:
>         sys.stdout = save_stdout

I hope you aren't going to be using that in any threaded program. 
That's one really nice thing about Lisp's dynamic variables: they 
automatically interact properly with threads.

(defvar *foo* nil)
(let ((*foo* 5))
   ; *foo* has a value of 5 for all functions called from here, but only
   ; in this thread. In other threads it'll still be nil.
)
; *foo* has gone back to nil.

James


From m.u.k.2 at gawab.com  Tue May  3 19:44:10 2005
From: m.u.k.2 at gawab.com (m.u.k)
Date: Tue, 3 May 2005 17:44:10 +0000 (UTC)
Subject: [Python-Dev] Need to hook Py_FatalError
References: <Xns964B980C943F1token@80.91.229.5>
	<ca471dc205050309156962d3ff@mail.gmail.com>
Message-ID: <Xns964BD56D743F2token@80.91.229.5>

Hi,

Guido van Rossum <gvanrossum at gmail.com> wrote in
news:ca471dc205050309156962d3ff at mail.gmail.com: 
 
> Your efforts would be better directed towards fixing the causes of the
> fatal errors.
>
> I see no need to hook Py_FatalError, but since it's open source, you
> are of course free to patch your own copy if your urge is truly
> irresistible. Or I guess you could run Python under supervision of gdb
> and trap it that way.

Well, I admit it is a bit trivial (as is its implementation); at least nobody 
has wanted it within Python's 10+ year lifetime. Indeed, I'm using my own 
patched copy. I just thought it'd be good if some other naughty boy playing 
dangerous games with interpreter internals didn't have to spend hours in a 
debugger trying to reproduce the crash.

> But tell me, what do you want the process to do instead of
> terminating? Py_FatalError is used in situations where raising an
> exception is impossible or would do more harm than good.

The need for this is for logging purposes only. E.g. the process just 
terminates on a client machine: you have no logs, no clues (except a core 
dump), a nightmare! Some sort of log would be invaluable here.
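[Editorial note: this logging need was eventually addressed in CPython 3.3 by the faulthandler module, which dumps Python tracebacks to a file on fatal errors and crashes:]

```python
import faulthandler
import sys

# Dump the tracebacks of all threads to stderr on a fatal error or crash,
# giving exactly the kind of post-mortem clue asked for above.
faulthandler.enable(file=sys.stderr, all_threads=True)
print(faulthandler.is_enabled())
```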


Best regards.


From jepler at unpythonic.net  Tue May  3 19:54:23 2005
From: jepler at unpythonic.net (Jeff Epler)
Date: Tue, 3 May 2005 12:54:23 -0500
Subject: [Python-Dev] Need to hook Py_FatalError
In-Reply-To: <ca471dc205050309156962d3ff@mail.gmail.com>
References: <Xns964B980C943F1token@80.91.229.5>
	<ca471dc205050309156962d3ff@mail.gmail.com>
Message-ID: <20050503175422.GF8344@unpythonic.net>

On Tue, May 03, 2005 at 09:15:42AM -0700, Guido van Rossum wrote:
> But tell me, what do you want the process to do instead of
> terminating? Py_FatalError is used in situations where raising an
> exception is impossible or would do more harm than good.

In an application which embeds Python, I want to show the application's
standard error dialog, which doesn't call any Python APIs (but does do
things like capture the call stack at the time of the error).  For this
use, it doesn't matter that no further calls to those APIs are possible.

Jeff
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 189 bytes
Desc: not available
Url : http://mail.python.org/pipermail/python-dev/attachments/20050503/62c5ddea/attachment.pgp

From m.u.k.2 at gawab.com  Tue May  3 19:47:57 2005
From: m.u.k.2 at gawab.com (m.u.k)
Date: Tue, 3 May 2005 17:47:57 +0000 (UTC)
Subject: [Python-Dev] Need to hook Py_FatalError
References: <Xns964B980C943F1token@80.91.229.5>
	<20050503090452.648C.JCARLSON@uci.edu>
Message-ID: <Xns964BD6113E2E7token@80.91.229.5>

Hi,

Josiah Carlson <jcarlson at uci.edu> wrote in
news:20050503090452.648C.JCARLSON at uci.edu: 

> In looking at the use of Py_FatalError in the Python Sources (it's a 10
> meg tarball that is well worth the download), it looks as though its use
> shows a Fatal error (hence the name).  Things like "Inconsistent
> interned string state" or "Immortal interned string died" or "Can't
> initialize type", etc.
> 
> Essentially, those errors generally signify "the internal state of
> python is messed up", whether that be by C extension, or even a bug in
> Python.  The crucial observation is that many of them have ambiguous
> possible recoveries.  How do you come back from "Can't initialize type",
> or even 'gc couldn't allocate "__del__"'?

The hook is not to come back just for logging, see my previous post please.

Best regards.





From skip at pobox.com  Tue May  3 20:11:10 2005
From: skip at pobox.com (Skip Montanaro)
Date: Tue, 3 May 2005 13:11:10 -0500
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <ca471dc20505031013287a2e92@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
Message-ID: <17015.48830.223391.390538@montanaro.dyndns.org>

>>>>> "Guido" == Guido van Rossum <gvanrossum at gmail.com> writes:

    Guido> [Skip Montanaro]
    >> Yeah, but "block synchronized(v1)" doesn't look like a loop.  I think
    >> this might be a common stumbling block for people using this
    >> construct.

    Guido> How many try/finally statements have you written inside a loop?
    Guido> In my experience this is extreeeemely rare. I found no
    Guido> occurrences in the standard library.

How'd we start talking about try/finally?  To the casual observer, this
looks like "break" should break out of the loop:

    while True:
        block synchronized(v1):
            ...
            if v1.field:
                break
        time.sleep(1)

The PEP says:

    Note that it is left in the middle whether a block-statement
    represents a loop or not; this is up to the iterator, but in the
    most common case BLOCK1 is executed exactly once.

That suggests to me it's still not clear if the block statement is actually
a looping statement.  If not, then "break" should almost certainly break out
of the while loop.

BTW, what does "left in the middle" mean?  I interpreted it as
"still undecided", but it's an idiom I've never seen.  Perhaps it should be
replaced by something clearer.

Skip

From python at rcn.com  Tue May  3 20:26:05 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 3 May 2005 14:26:05 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc2050503095336305c8b@mail.gmail.com>
Message-ID: <001101c5500d$a7be7140$c704a044@oemcomputer>

[Raymond]
> > Likewise, is it correct that "yield" is anti-parallel to the current
> > meaning?  Inside a generator, it returns control upwards to the caller.
> > But inside a block-iterator, it pushes control downwards (?inwards) to
> > the block it controls.

[Guido van Rossum]
> I have a hard time visualizing the difference. They feel the same to
> me, and the implementation (from the generator's POV) is identical:
> yield suspends the current frame, returning to the previous frame from
> the call to next() or __next__(), and the suspended frame can be
> resumed by calling next() / __next__() again.

This concept ought to be highlighted in the PEP because it explains
clearly what "yield" does and it may help transition from a non-Dutch
mental model.  I expect that many folks (me included) think in terms of
caller vs callee with a parallel spatial concept of enclosing vs
enclosed.  In that model, the keywords "continue", "break", "yield", and
"return" all imply a control transfer from the enclosed back to the
encloser.  

In contrast, the new use of yield differs in that the suspended frame
transfers control from the encloser to the enclosed. 



> > Are there some good use cases that do not involve resource locking?
> > IIRC, that same use case was listed as a prime motivating example for
> > decorators (i.e. @synchronized).  TOOWTDI suggests that a single use case
> > should not be used to justify multiple, orthogonal control structures.
> 
> Decorators don't need @synchronized as a motivating use case; there
> are plenty of other use cases.

No doubt about that.



> Anyway, @synchronized was mostly a demonstration toy; whole method
> calls are rarely the right granularity of locking. 

Agreed.  Since that is the case, there should be some effort to shift
some of the examples towards real use cases where a block-iterator is
the appropriate solution.  It need not hold-up releasing the PEP to
comp.lang.python, but it would go a long way towards improving the
quality of the subsequent discussion.



> (BTW in the latest
> version of PEP 340 I've renamed synchronized to locking; many people
> complained about the strange Javaesque term.)

That was diplomatic.  Personally, I find it amusing when there is an
early focus on naming rather than on functionality, implementation
issues, use cases, usability, and goodness-of-fit within the language.



> > It would be great if we could point to some code in the standard library
> > or in a major Python application that would be better (cleaner, faster,
> > or clearer) if re-written using blocks and block-iterators

> look
> more closely at Queue, and you'll find that the two such methods use
> different locks!

I don't follow this one.   Tim's uses of not_empty and not_full are
orthogonal (pertaining to pending gets at one end of the queue and to
pending puts at the other end).  The other use of the mutex is
independent of either pending puts or gets; instead, it is a weak
attempt to minimize what can happen to the queue during a size query.

While the try/finallys could get factored-out into separate blocks, I do
not see how the code could be considered better off.  There is a slight
worsening of all metrics of merit:   line counts, total number of
function defs, number of calls, or number of steps executed outside the
lock (important given that the value of a query result declines rapidly
once the lock is released).


 
> Also the use case for closing a file upon leaving a block, while
> clearly a resource allocation use case, doesn't work well with a
> decorator.

Right. 



> I just came across another use case that is fairly common in the
> standard library: redirecting sys.stdout. This is just a beauty (in
> fact I'll add it to the PEP):
> 
> def saving_stdout(f):
>     save_stdout = sys.stdout
>     try:
>         sys.stdout = f
>         yield
>     finally:
>         sys.stdout = save_stdout
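In modern Python the quoted generator can be exercised by hand, which makes the control flow concrete; a minimal sketch, driving the generator with next()/close() in lieu of any block syntax:

```python
import sys
from io import StringIO

def saving_stdout(f):
    # The generator quoted above, unchanged in substance.
    save_stdout = sys.stdout
    try:
        sys.stdout = f
        yield
    finally:
        sys.stdout = save_stdout

buf = StringIO()
gen = saving_stdout(buf)
next(gen)           # run up to the yield: stdout is now redirected
print("captured")   # goes to buf, not the real stdout
gen.close()         # GeneratorExit fires the finally: stdout restored
```

The block-statement would do the next()/close() bookkeeping automatically; the body of the block runs where the print appears here.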

This is the strongest example so far.  When adding it to the PEP, it
would be useful to contrast the code with simpler alternatives like PEP
288's g.throw() or PEP 325's g.close().  On the plus side, the
block-iterator approach factors out code common to multiple callers.  On
the minus side, the other PEPs involve simpler mechanisms and their
learning curve would be nearly zero.  These pluses and minuses are
important because they apply equally to all examples using blocks for
initialization/finalization.



Raymond

From gvanrossum at gmail.com  Tue May  3 20:31:55 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 11:31:55 -0700
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <17015.48830.223391.390538@montanaro.dyndns.org>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
Message-ID: <ca471dc2050503113127f938b0@mail.gmail.com>

[Skip Montanaro]
>     >> Yeah, but "block synchronized(v1)" doesn't look like a loop.  I think
>     >> this might be a common stumbling block for people using this
>     >> construct.
> 
>     Guido> How many try/finally statements have you written inside a loop?
>     Guido> In my experience this is extreeeemely rare. I found no
>     Guido> occurrences in the standard library.

[Skip again]
> How'd we start talking about try/finally?

Because it provides by far the dominant use case for 'block'. The
block-statement is intended to replace many boilerplate uses of
try/finally. In addition, it's also a coroutine invocation primitive.

> To the casual observer, this
> looks like "break" should break out of the loop:
> 
>     while True:
>         block synchronized(v1):
>             ...
>             if v1.field:
>                 break
>         time.sleep(1)

Without 'block' this would be written as try/finally. And my point is
that people just don't write try/finally inside a while loop very
often (I found *no* examples in the entire standard library).

> The PEP says:
> 
>     Note that it is left in the middle whether a block-statement
>     represents a loop or not; this is up to the iterator, but in the
>     most common case BLOCK1 is executed exactly once.
> 
> That suggests to me it's still not clear if the block statement is actually
> a looping statement.  If not, then "break" should almost certainly break out
> of the while loop.

Dynamically, it's most likely not a loop. But the compiler doesn't
know that, so the compiler considers it a loop.

> BTW, what does "left in the middle" mean?  I interpreted it as
> "still undecided", but it's an idiom I've never seen.  Perhaps it should be
> replaced by something clearer.

It may be a Dutch phrase that doesn't translate to English as well as I
thought. It doesn't exactly mean "still undecided" but more "depends
on your POV". I'll use something different, and also clarify that as
far as break/continue are concerned, it *is* a loop.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From jcarlson at uci.edu  Tue May  3 20:34:52 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Tue, 03 May 2005 11:34:52 -0700
Subject: [Python-Dev] Need to hook Py_FatalError
In-Reply-To: <Xns964BD56D743F2token@80.91.229.5>
References: <ca471dc205050309156962d3ff@mail.gmail.com>
	<Xns964BD56D743F2token@80.91.229.5>
Message-ID: <20050503112637.648F.JCARLSON@uci.edu>


"m.u.k" <m.u.k.2 at gawab.com> wrote:
> 
> Hi,
> 
> Guido van Rossum <gvanrossum at gmail.com> wrote in
> news:ca471dc205050309156962d3ff at mail.gmail.com: 
>  
> > Your efforts would be better directed towards fixing the causes of the
> > fatal errors.
> >
> > I see no need to hook Py_FatalError, but since it's open source, you
> > are of course free to patch your own copy if your urge is truly
> > irresistible. Or I guess you could run Python under supervision of gdb
> > and trap it that way.
> 
> Well, I admit it is a bit trivial (as is its implementation); at least nobody
> has wanted it within Python's 10+ year lifetime. Indeed I'm using my own
> patched copy; I just thought it'd be good if some other naughty boy playing
> dangerous games with interpreter internals didn't have to spend hours in a
> debugger trying to reproduce the crash.
> 
> > But tell me, what do you want the process to do instead of
> > terminating? Py_FatalError is used in situations where raising an
> > exception is impossible or would do more harm than good.
> 
> The need for this is for logging purposes only. E.g. the process just
> terminates on a client machine: you have no logs, no clues (except a
> coredump); a nightmare! Some sort of log would be invaluable here.

Offering any hook for Py_FatalError may not even be enough, as some of
those errors are caused by insufficient memory.  What if a hook were
available, but it couldn't be called because there wasn't enough memory?

Of course there is the option of pre-allocating a few kilobytes, then
just before one calls the hook, freeing that memory so that the hook can
execute (assuming the hook is small enough).  I'm not sure if this is a
desirable general mechanic, but it may be sufficient for you.  If you
do figure out a logging mechanism that is almost guaranteed to execute
on FatalError, post it to sourceforge.


 - Josiah


From gvanrossum at gmail.com  Tue May  3 20:48:09 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 11:48:09 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <001101c5500d$a7be7140$c704a044@oemcomputer>
References: <ca471dc2050503095336305c8b@mail.gmail.com>
	<001101c5500d$a7be7140$c704a044@oemcomputer>
Message-ID: <ca471dc205050311481385b64f@mail.gmail.com>

> [Raymond]
> > > Likewise, is it correct that "yield" is anti-parallel to the current
> > > meaning?  Inside a generator, it returns control upwards to the caller.
> > > But inside a block-iterator, it pushes control downwards (?inwards) to
> > > the block it controls.
> 
> [Guido van Rossum]
> > I have a hard time visualizing the difference. They feel the same to
> > me, and the implementation (from the generator's POV) is identical:
> > yield suspends the current frame, returning to the previous frame from
> > the call to next() or __next__(), and the suspended frame can be
> > resumed by calling next() / __next__() again.

[Raymond]
> This concept ought to be highlighted in the PEP because it explains
> clearly what "yield" does and it may help transition from a non-Dutch
> mental model.  I expect that many folks (me included) think in terms of
> caller vs callee with a parallel spatial concept of enclosing vs
> enclosed.  In that model, the keywords "continue", "break", "yield", and
> "return" all imply a control transfer from the enclosed back to the
> encloser.

I'm still confused and surprised that you think I need to explain what
yield does, since the PEP doesn't change one bit about this.

The encloser/enclosed parallel to caller/callee doesn't make sense to
me; but that may just be because I'm Dutch.

> In contrast, the new use of yield differs in that the suspended frame
> transfers control from the encloser to the enclosed.

Why does your notion of who encloses whom suddenly reverse when you go
from a for-loop to a block-statement? This all feels very strange to
me.

> > Anyway, @synchronized was mostly a demonstration toy; whole method
> > calls are rarely the right granularity of locking.
> 
> Agreed.  Since that is the case, there should be some effort to shift
> some of the examples towards real use cases where a block-iterator is
> the appropriate solution.  It need not hold-up releasing the PEP to
> comp.lang.python, but it would go a long way towards improving the
> quality of the subsequent discussion.

Um? I thought I just showed that locking *is* a good use case for the
block-statement and you agreed; now why would I have to move away from
it?

I think I'm thoroughly confused by your critique of the PEP. Perhaps
you could suggest some concrete rewritings to knock me out of my
confusion?

> Personally, I find it amusing when there is an
> early focus on naming rather than on functionality, implementation
> issues, use cases, usability, and goodness-of-fit within the language.

Well, the name of a proposed concept does a lot to establish its first
impression. First impressions matter!

> > > It would be great if we could point to some code in the standard library
> > > or in a major Python application that would be better (cleaner, faster,
> > > or clearer) if re-written using blocks and block-iterators
> 
> > look
> > more closely at Queue, and you'll find that the two such methods use
> > different locks!
> 
> I don't follow this one.   Tim's uses of not_empty and not_full are
> orthogonal (pertaining to pending gets at one end of the queue and to
> pending puts at the other end).  The other use of the mutex is
> independent of either pending puts or gets; instead, it is a weak
> attempt to minimize what can happen to the queue during a size query.

I meant to use this as an example of the unsuitability of the
@synchronized decorator, since it implies that all synchronization is
on the same mutex, thereby providing a use case for the locking
block-statement.

I suspect we're violently in agreement though.

> While the try/finallys could get factored-out into separate blocks, I do
> not see how the code could be considered better off.  There is a slight
> worsening of all metrics of merit:   line counts, total number of
> function defs, number of calls, or number of steps executed outside the
> lock (important given that the value a query result declines rapidly
> once the lock is released).

I don't see how the line count metric would lose: a single "locking()"
primitive exported by the threading module would be usable by all code
that currently uses try/finally to acquire and release a lock.
Performance needn't suffer either, if the locking() primitive is
implemented in C (it could be a straightforward translation of example
6 into C).
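Such a locking() primitive would be only a few lines as a block-iterator. A hedged sketch (the name and its placement in threading are assumptions from the discussion, and the generator is driven by hand here since the block syntax does not exist):

```python
import threading

def locking(lock):
    # Sketch of the proposed primitive: acquire on entry,
    # guarantee release on any exit from the block.
    lock.acquire()
    try:
        yield          # the block's body would run here
    finally:
        lock.release()

lock = threading.Lock()
gen = locking(lock)
next(gen)              # "enter the block": lock is now held
# ... critical section ...
gen.close()            # "leave the block": the finally releases the lock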

> > I just came across another use case that is fairly common in the
> > standard library: redirecting sys.stdout. This is just a beauty (in
> > fact I'll add it to the PEP):
> >
> > def saving_stdout(f):
> >     save_stdout = sys.stdout
> >     try:
> >         sys.stdout = f
> >         yield
> >     finally:
> >         sys.stdout = save_stdout
> 
> This is the strongest example so far.  When adding it to the PEP, it
> would be useful to contrast the code with simpler alternatives like PEP
> 288's g.throw() or PEP 325's g.close().  On the plus side, the
> block-iterator approach factors out code common to multiple callers.  On
> the minus side, the other PEPs involve simpler mechanisms and their
> learning curve would be nearly zero.  These pluses and minuses are
> important because they apply equally to all examples using blocks for
> initialization/finalization.

Where do you see a learning curve for blocks?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From m.u.k.2 at gawab.com  Tue May  3 20:58:40 2005
From: m.u.k.2 at gawab.com (m.u.k)
Date: Tue, 3 May 2005 18:58:40 +0000 (UTC)
Subject: [Python-Dev] Need to hook Py_FatalError
References: <ca471dc205050309156962d3ff@mail.gmail.com>
	<Xns964BD56D743F2token@80.91.229.5>
	<20050503112637.648F.JCARLSON@uci.edu>
Message-ID: <Xns964BE20F692E9token@80.91.229.5>

Hi,

Josiah Carlson <jcarlson at uci.edu> wrote in
news:20050503112637.648F.JCARLSON at uci.edu: 

> Offering any hook for Py_FatalError may not even be enough, as some of
> those errors are caused by insufficient memory.  What if a hook were
> available, but it couldn't be called because there wasn't enough memory?
> 
> Of course there is the option of pre-allocating a few kilobytes, then
> just before one calls the hook, freeing that memory so that the hook can
> execute (assuming the hook is small enough).  I'm not sure if this is a
> desirable general mechanic, but it may be sufficient for you.  If you
> do figure out a logging mechanism that is almost guaranteed to execute
> on FatalError, post it to sourceforge.
 
IMHO this should be left to the hook implementer ("hooker" is apparently not
the right word, but you get the point :) ). If he allocates more memory or
does heavy stuff, that will just fail. Anyway, abort() is a failure too:
either abort() will end the process, or the OS will on such a critical error.

Best regards.


From jimjjewett at gmail.com  Tue May  3 21:07:37 2005
From: jimjjewett at gmail.com (Jim Jewett)
Date: Tue, 3 May 2005 15:07:37 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
Message-ID: <fb6fbf560505031207e6e8ff@mail.gmail.com>

[Raymond Hettinger]
>> Likewise, is it correct that "yield" is anti-parallel to the current
>> meaning?  Inside a generator, it returns control upwards to the caller.
>> But inside a block-iterator, it pushes control downwards (?inwards) to
>> the block it controls.

Guido:
> I have a hard time visualizing the difference.

In a normal generator, someone makes a call to establish the 
generator, which then becomes a little island -- anyone can call 
the generator, and it returns control back to whoever made the last call.

With the block, every yield returns to a single designated callback.
This callback had to be established at the same time the block was
created, and must be textually inside it.  (An indented suite to the 
"block XXX:" line.)

>> Are there some good use cases that do not involve resource locking?

> Decorators don't need @synchronized as a motivating use case; 
> there are plenty of other use cases.

But are there plenty of other use cases for PEP 340?

If not, then why do we need PEP 340?  Are decorators not strong
enough, or is it just that people aren't comfortable yet?  If it is a
matter of comfort or recipes, then the new construct might have
just as much trouble.  (So this one is not a loop, and you can tell
the difference because ... uh, just skip that advanced stuff.)

> Anyway, @synchronized was mostly a demonstration toy; whole
> method calls are rarely the right granularity of locking.

That is an important difference -- though I'm not sure that the critical
part *shouldn't* be broken out into a separate method.

>> I've scanned through the code base looking for some places
>> to apply the idea and have come up empty handed.

> I presume you mentally discarded the resource allocation use
> cases where the try/finally statement was the outermost statement
> in the function body, since those would be helped by @synchronized;
> but look more closely at Queue, and you'll find that the two such
> methods use different locks!

qsize, empty, and full could be done with a lockself decorator.
Effectively, they *are* lockself decorators for the _xxx functions 
that subclasses are told to override.

If you're talking about put and get, decorators don't help as much,
but I'm not sure blocks are much better.  

You can't replace the outermost try ... finally with a common decorator 
because the locks are self variables.  A block, by being inside a method,
would be delayed until self exists -- but that outer lock is only a tiny
fraction of the boilerplate.  It doesn't help with

            if not block:
                if self._STATE():
                    raise STATEException
            elif timeout is None:
                while self._STATE():
                    self.not_STATE.wait()
            else:
                if timeout < 0:
                    raise ValueError("'timeout' must be a positive number")
                endtime = _time() + timeout
                while self._STATE():
                    remaining = endtime - _time()
                    if remaining <= 0.0:
                        raise STATEException
                    self.not_STATE.wait(remaining)
            val = self._RealMethod(item)  # OK, the put optimizes out this and the return
            self.not_OTHERSTATE.notify()
            return val

I wouldn't object to a helper method, but using a block just to get rid of four
lines (two of which are the literals "try:" and "finally:") seems barely worth
doing, let alone with special new syntax.

> Also the use case for closing a file upon leaving a block, while
> clearly a resource allocation use case, doesn't work well with a
> decorator.

def autoclose(fn):
    def outer(filename, *args, **kwargs):
        f = open(filename)
        val = fn(f, *args, **kwargs)
        f.close()
        return val
    return outer

@autoclose
def f1(f):
    for line in f:
        print line

> I just came across another use case that is fairly common in the
> standard library: redirecting sys.stdout. This is just a beauty (in
> fact I'll add it to the PEP):

> def saving_stdout(f):
>     save_stdout = sys.stdout
>     try:
>         sys.stdout = f
>         yield
>     finally:
>         sys.stdout = save_stdout

Why does this need a yield?  Why not just a regular call to the
function?  If you're trying to generalize the redirector, then this
also works as a decorator.  The nested functions (and the *args,
**kwargs, if you don't inherit from a standard decorator) are a
bit of an annoyance, but I'm not sure the new iterator form will
be any easier to explain.

def saving_stdout(f):
    import sys   # Just in case...
    def captured_stream(fn):
        def redirect(*args, **kwargs):
            save_stdout = sys.stdout
            try:
                sys.stdout = f
                return fn (*args, **kwargs)
            finally:
                sys.stdout = save_stdout
        return redirect
    return captured_stream

o=StringIO()
@saving_stdout(o)
...

From tim.peters at gmail.com  Tue May  3 21:13:52 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Tue, 3 May 2005 15:13:52 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc205050311481385b64f@mail.gmail.com>
References: <ca471dc2050503095336305c8b@mail.gmail.com>
	<001101c5500d$a7be7140$c704a044@oemcomputer>
	<ca471dc205050311481385b64f@mail.gmail.com>
Message-ID: <1f7befae050503121346833d97@mail.gmail.com>

[Raymond]
>>>> It would be great if we could point to some code in the standard library
>>>> or in a major Python application that would be better (cleaner, faster,
>>>> or clearer) if re-written using blocks and block-iterators

[Guido]
>>> look more closely at Queue, and you'll find that the two such methods
>>> use different locks!

[Raymond]
>> I don't follow this one.   Tim's uses of not_empty and not_full are
>> orthogonal (pertaining to pending gets at one end of the queue and to
>> pending puts at the other end).  The other use of the mutex is
>> independent of either pending puts or gets; instead, it is a weak
>> attempt to minimize what can happen to the queue during a size query.
 
[Guido]
> I meant to use this as an example of the unsuitability of the
> @synchronized decorator, since it implies that all synchronization is
> on the same mutex, thereby providing a use case for the locking
> block-statement.

Queue may be a confusing example.  Older versions of Queue did indeed
use more than one mutex.  The _current_ (2.4+) version of Queue uses
only one mutex, but shared across two condition variables (`not_empty`
and `not_full` are condvars in current Queue, not locks).  Where,
e.g., current Queue.put() starts with

        self.not_full.acquire()

it _could_ say

        self.not_empty.acquire()

instead with the same semantics, or it could say

        self.mutex.acquire()

They all do an acquire() on the same mutex.  If put() needs to wait,
it needs to wait on the not_full condvar, so it's conceptually
clearest for put() to spell it the first of these ways.

Because Queue does use condvars now instead of plain locks, I wouldn't
approve of any gimmick purporting to hide the acquire/release's in
put() or get():  that those are visible is necessary to seeing that
the _condvar_ protocol is being followed ("must acquire() before
wait(); must be acquire()'ed during notify(); no path should leave the
condvar acquire()d 'for a long time' before a wait() or release()").
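The protocol Tim describes can be made concrete with a stripped-down queue: one mutex shared by two condition variables, with every acquire/release kept visible. An illustrative sketch, modeled loosely on the 2.4 Queue rather than the real code:

```python
import threading
from collections import deque

class BoundedQueue:
    """One mutex, two condvars sharing it: the condvar protocol."""
    def __init__(self, maxsize):
        self.mutex = threading.Lock()
        self.not_empty = threading.Condition(self.mutex)
        self.not_full = threading.Condition(self.mutex)
        self.queue = deque()
        self.maxsize = maxsize

    def put(self, item):
        self.not_full.acquire()        # same lock as self.mutex.acquire()
        try:
            while len(self.queue) >= self.maxsize:
                self.not_full.wait()   # must hold the lock to wait()
            self.queue.append(item)
            self.not_empty.notify()    # must hold the lock to notify()
        finally:
            self.not_full.release()

    def get(self):
        self.not_empty.acquire()
        try:
            while not self.queue:
                self.not_empty.wait()
            item = self.queue.popleft()
            self.not_full.notify()
            return item
        finally:
            self.not_empty.release()
```

Hiding the acquire/release pairs behind a block would obscure exactly the structure (acquire, wait-loop, notify, release) that makes the protocol checkable by eye.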

From gvanrossum at gmail.com  Tue May  3 21:40:11 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 12:40:11 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <fb6fbf560505031207e6e8ff@mail.gmail.com>
References: <fb6fbf560505031207e6e8ff@mail.gmail.com>
Message-ID: <ca471dc20505031240c7ffc2b@mail.gmail.com>

> [Raymond Hettinger]
> >> Likewise, is it correct that "yield" is anti-parallel to the current
> >> meaning?  Inside a generator, it returns control upwards to the caller.
> >> But inside a block-iterator, it pushes control downwards (?inwards) to
> >> the block it controls.
> 
[Guido]
> > I have a hard time visualizing the difference.

[Jim Jewett]
> In a normal generator, someone makes a call to establish the
> generator, which then becomes a little island -- anyone can call
> the generator, and it returns control back to whoever made the last call.
> 
> With the block, every yield returns to a single designated callback.
> This callback had to be established at the same time the block was
> created, and must be textually inside it.  (An indented suite to the
> "block XXX:" line.)

Doesn't convince me. The common use for a regular generator is in a
for-loop, where every yield also returns to a single designated place
(calling it callback is really deceptive!).

And with a block, you're free to put the generator call ahead of the
block so you can call next() on it manually:

    it = EXPR1
    block it:
        BLOCK1

is totally equivalent to

    block EXPR1:
        BLOCK1

but the first form lets you call next() on it as you please (until the
block is exited, for sure).

> But are there plenty of other use cases for PEP 340?

Yes. Patterns like "do this little dance in a try/finally block" and
"perform this tune when you catch an XYZ exception" are pretty common
in larger systems and are effectively abstracted away using the
block-statement and an appropriate iterator. The try/finally use case
often also has some setup that needs to go right before the try (and
sometimes some more setup that needs to go *inside* the try). Being
able to write this once makes it a lot easier when the "little dance"
has to be changed everywhere it is performed.

> If not, then why do we need PEP 340?  Are decorators not strong
> enough, or is it just that people aren't comfortable yet?  If it is a
> matter of comfort or recipes, then the new construct might have
> just as much trouble.  (So this one is not a loop, and you can tell
> the difference because ... uh, just skip that advanced stuff.)

PEP 340 and decorators are totally different things, and the only
vaguely common use case would be @synchronized, which is *not* a
proper use case for decorators, but "safe locking" is definitely a use
case for PEP 340.

> > Anyway, @synchronized was mostly a demonstration toy; whole
> > method calls are rarely the right granularity of locking.
> 
> That is an important difference -- though I'm not sure that the critical
> part *shouldn't* be broken out into a separate method.

I'll be the judge of that. I have plenty of examples where breaking it
out would create an entirely artificial helper method that takes
several arguments just because it needs to use stuff that its caller
has set up for it.

> > I presume you mentally discarded the resource allocation use
> > cases where the try/finally statement was the outermost statement
> > in the function body, since those would be helped by @synchronized;
> > but look more closely at Queue, and you'll find that the two such
> > methods use different locks!
> 
> qsize, empty, and full could be done with a lockself decorator.
> Effectively, they *are* lockself decorators for the _xxx functions
> that subclasses are told to override.

Actually you're pointing out a bug in the Queue module: these *should*
be using a try/finally clause to ensure the mutex is released even if
the inner call raises an exception. I hadn't noticed these before
because I was scanning only for "finally".

If a locking primitive had been available, I'm sure it would have been
used here.

> If you're talking about put and get, decorators don't help as much,
> but I'm not sure blocks are much better.
> 
> You can't replace the outermost try ... finally with a common decorator
> because the locks are self variables.  A block, by being inside a method,
> would be delayed until self exists -- but that outer lock is only a
> tiny fraction of the boilerplate.  It doesn't help with
> [...example deleted...]
> I wouldn't object to a helper method, but using a block just to get rid of four
> lines (two of which are the literals "try:" and "finally:") seems barely worth
> doing, let alone with special new syntax.

Well, to me it does; people have been requesting new syntax for this
specific case for a long time (that's where PEP 310 is coming from).

> > Also the use case for closing a file upon leaving a block, while
> > clearly a resource allocation use case, doesn't work well with a
> > decorator.
> 
> def autoclose(fn):
>     def outer(filename, *args, **kwargs):
>         f = open(filename)
>         val = fn(f, *args, **kwargs)
>         f.close()
>         return val
>     return outer
> 
> @autoclose
> def f1(f):
>     for line in f:
>         print line

But the auto-closing file, even more than the self-releasing lock,
most often occurs in the middle of some code that would be unnatural
to turn into a helper method just so that you can use a decorator
pattern. In fact your example is so confusing that I can't figure out
whether it has a bug or whether I'm just confused. This is *not* a
good use case for decorators.
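The bug in the quoted decorator is that the file leaks if `fn` raises. A sketch of an exception-safe version (a hypothetical helper, not from the PEP) makes the point that the decorator approach still needs the very try/finally it was supposed to hide:

```python
def autoclose(fn):
    # Exception-safe variant of the quoted decorator: the file is
    # closed even if fn raises.
    def outer(filename, *args, **kwargs):
        f = open(filename)
        try:
            return fn(f, *args, **kwargs)
        finally:
            f.close()
    return outer
```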

> > I just came across another use case that is fairly common in the
> > standard library: redirecting sys.stdout. This is just a beauty (in
> > fact I'll add it to the PEP):
> 
> > def saving_stdout(f):
> >     save_stdout = sys.stdout
> >     try:
> >         sys.stdout = f
> >         yield
> >     finally:
> >         sys.stdout = save_stdout
> 
> Why does this need a yield?  Why not just a regular call to the
> function?

Because PEP 340 uses yield to pass control to the body of the
block-statement. (I have to resist the urge to add, ", dummy!" :-)

I can't tell whether you have totally not grasped PEP 340, or you are
proposing to solve all its use cases by defining an explicit function
or method representing the body of the block. The latter solution
leads to way too much ugly code -- all that function-definition
boilerplate is worse than the try/finally boilerplate we're trying to
hide!

> If you're trying to generalize the redirector, then this
> also works as a decorator.  The nested functions (and the *args,
> **kwargs, if you don't inherit from a standard decorator) is a
> bit of an annoyance, but I'm not sure the new iterator form will
> be any easier to explain.
> 
> def saving_stdout(f):
>     import sys   # Just in case...
>     def captured_stream(fn):
>         def redirect(*args, **kwargs):
>             save_stdout = sys.stdout
>             try:
>                 sys.stdout = f
>                 return fn (*args, **kwargs)
>             finally:
>                 sys.stdout = save_stdout
>         return redirect
>     return captured_stream
> 
> o=StringIO()
> @saving_stdout(o)
> ...

This has absolutely nothing to recommend it.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Tue May  3 21:48:05 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 12:48:05 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <1f7befae050503121346833d97@mail.gmail.com>
References: <ca471dc2050503095336305c8b@mail.gmail.com>
	<001101c5500d$a7be7140$c704a044@oemcomputer>
	<ca471dc205050311481385b64f@mail.gmail.com>
	<1f7befae050503121346833d97@mail.gmail.com>
Message-ID: <ca471dc205050312485bde01fe@mail.gmail.com>

[Tim]
> Because Queue does use condvars now instead of plain locks, I wouldn't
> approve of any gimmick purporting to hide the acquire/release's in
> put() or get():  that those are visible is necessary to seeing that
> the _condvar_ protocol is being followed ("must acquire() before
> wait(); must be acquire()'ed during notify(); no path should leave the
> condvar acquire()d 'for a long time' before a wait() or release()").

So you think that this would be obscure? A generic condition variable
use could look like this:

    block locking(self.condvar):
        while not self.items:
            self.condvar.wait()
        self.process(self.items)
        self.items = []

instead of this:

    self.condvar.acquire()
    try:
        while not self.items:
            self.condvar.wait()
        self.process(self.items)
        self.items = []
    finally:
        self.condvar.release()

I find that the "block locking" version looks just fine; it makes the
scope of the condition variable quite clear despite not having any
explicit acquire() or release() calls (there are some abstracted away
in the wait() call too!).
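The try/finally version above is runnable today; here is a self-contained sketch using `threading.Condition`, with a producer added so the example is complete (the `Consumer` class and method names are invented for illustration):

```python
import threading

class Consumer:
    def __init__(self):
        self.condvar = threading.Condition()
        self.items = []
        self.processed = []

    def process(self, items):
        self.processed.extend(items)

    def consume(self):
        # The explicit acquire/release bracketing the wait() loop is
        # what the "block locking(self.condvar)" form would abstract away.
        self.condvar.acquire()
        try:
            while not self.items:
                self.condvar.wait()
            self.process(self.items)
            self.items = []
        finally:
            self.condvar.release()

    def produce(self, item):
        self.condvar.acquire()
        try:
            self.items.append(item)
            self.condvar.notify()
        finally:
            self.condvar.release()
```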

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tim.peters at gmail.com  Tue May  3 22:10:42 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Tue, 3 May 2005 16:10:42 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc20505031240c7ffc2b@mail.gmail.com>
References: <fb6fbf560505031207e6e8ff@mail.gmail.com>
	<ca471dc20505031240c7ffc2b@mail.gmail.com>
Message-ID: <1f7befae0505031310200564c6@mail.gmail.com>

...

[Jim Jewett]
>> qsize, empty, and full could be done with a lockself decorator.
>> Effectively, they *are* lockself decorators for the _xxx functions
>> that subclasses are told to override.

[Guido] 
> Actually you're pointing out a bug in the Queue module: these *should*
> be using a try/finally clause to ensure the mutex is released even if
> the inner call raises an exception.

Yup!  OTOH, if those dead-simple methods raised an exception, the
Queue has probably gone wholly insane anyway.

> I hadn't noticed these before because I was scanning only for "finally"
>
> If a locking primitive had been available, I'm sure it would have been
> used here.

That too.

From hpk at trillke.net  Tue May  3 22:14:00 2005
From: hpk at trillke.net (holger krekel)
Date: Tue, 3 May 2005 22:14:00 +0200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <ca471dc20505021755518773c8@mail.gmail.com>
References: <ca471dc20505021755518773c8@mail.gmail.com>
Message-ID: <20050503201400.GE30548@solar.trillke.net>

Hi Guido, 

On Mon, May 02, 2005 at 17:55 -0700, Guido van Rossum wrote:
> These are the loose ends on the PEP (apart from filling in some
> missing sections):
> 
> 1. Decide on a keyword to use, if any.

I just read PEP 340 for basically the first time, so bear with me. 

First i note that introducing a keyword 'block' would break
lots of programs, among them half of PyPy.  Unlike many other
keywords, 'block' is a pretty common variable name.  For
invoking blocktemplates i like the no-keyword approach instead. 

However, i would find it much clearer if *defining* blocktemplates 
used a new keyword, like: 

    blocktemplate opening(filename, mode="r"): 
        ... 

because this immediately tells me what the purpose and semantics
of the following definition are.   The original overloading of 'def' to 
mean generators if the body contains a yield statement was already a 
matter of discussion (AFAIK).  When i came to Python it was at 2.2
and i remember wondering about this "def" oddity. 

Extending poor old 'def' functions now to possibly mean block
templates gives me semantical overload even if it is justified
from an implementation point of view.  I am talking purely 
about (my sense of) code readability here not about implementation. 

cheers, 

    holger

From gmilas at gmail.com  Tue May  3 21:50:11 2005
From: gmilas at gmail.com (Gheorghe Milas)
Date: Tue, 3 May 2005 19:50:11 +0000 (UTC)
Subject: [Python-Dev] 2 words keyword for block
Message-ID: <loom.20050503T214709-362@post.gmane.org>

I'm not really in a position to speak, but since I have only seen people 
trying to come up with single-word keywords, without much success, I would 
venture to suggest the possibility of making a keyword out of two words.
Would it be a huge problem to use two words to make up a keyword?

Like, for example, <in template> (or <intemplate> if using a space is a real problem):

in template thread_safe(lock):
in template redirected_stdout(stream):
in template use_and_close_file(path) as file:
in template as_transaction():
in template auto_retry(times=3, failas=IOError):


From gvanrossum at gmail.com  Tue May  3 22:20:38 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 13:20:38 -0700
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <20050503201400.GE30548@solar.trillke.net>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
Message-ID: <ca471dc2050503132010abb4df@mail.gmail.com>

[Holger]
> > 1. Decide on a keyword to use, if any.
> 
> I just read PEP 340 for basically the first time, so bear with me.

Thanks for reviewing!

> First i note that introducing a keyword 'block' would break
> lots of programs, among them half of PyPy.  Unlike many other
> keywords, 'block' is a pretty common variable name.  For
> invoking blocktemplates i like the no-keyword approach instead.

Good point (the code from Queue.py quoted by Jim Jewett also uses
block as a variable name :-). There has been much argument on both
sides. I guess we may need to have a subcommittee to select the
keyword (if any) ...

Maybe if we can't go without a keyword, 'with' would be okay after
all; I'm not so strongly in favor of a Pascal/VB-style with-statement
after reading the C# developers' comments (see reference in the PEP).

> However, i would find it much clearer if *defining* blocktemplates
> used a new keyword, like:
> 
>     blocktemplate opening(filename, mode="r"):
>         ...
> 
> because this immediately tells me what the purpose and semantics
> of the following definition are.   The original overloading of 'def' to
> mean generators if the body contains a yield statement was already a
> matter of discussion (AFAIK).  When i came to Python it was at 2.2
> and i remember wondering about this "def" oddity.
> 
> Extending poor old 'def' functions now to possibly mean block
> templates gives me semantical overload even if it is justified
> from an implementation point of view.  I am talking purely
> about (my sense of) code readability here not about implementation.

Hm... Maybe you also want to have separate function and procedure
keywords? Or static typing? 'def' can be used to define all sorts of
things, that is Python's beauty!

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tim.peters at gmail.com  Tue May  3 22:21:35 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Tue, 3 May 2005 16:21:35 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc205050312485bde01fe@mail.gmail.com>
References: <ca471dc2050503095336305c8b@mail.gmail.com>
	<001101c5500d$a7be7140$c704a044@oemcomputer>
	<ca471dc205050311481385b64f@mail.gmail.com>
	<1f7befae050503121346833d97@mail.gmail.com>
	<ca471dc205050312485bde01fe@mail.gmail.com>
Message-ID: <1f7befae05050313212db5d4df@mail.gmail.com>

[Tim]
>> Because Queue does use condvars now instead of plain locks, I wouldn't
>> approve of any gimmick purporting to hide the acquire/release's in
>> put() or get():  that those are visible is necessary to seeing that
>> the _condvar_ protocol is being followed ("must acquire() before
>> wait(); must be acquire()'ed during notify(); no path should leave the
>> condvar acquire()d 'for a long time' before a wait() or release()").

[Guido]
> So you think that this would be obscure? A generic condition variable
> use could look like this:
> 
>    block locking(self.condvar):
>        while not self.items:
>            self.condvar.wait()
>        self.process(self.items)
>        self.items = []
> 
> instead of this:
> 
>    self.condvar.acquire()
>    try:
>        while not self.items:
>            self.condvar.wait()
>        self.process(self.items)
>        self.items = []
>    finally:
>        self.condvar.release()
>
> I find that the "block locking" version looks just fine; it makes the
> scope of the condition variable quite clear despite not having any
> explicit acquire() or release() calls (there are some abstracted away
> in the wait() call too!).

Actually typing it all out like that makes it hard to dislike <wink>. 
Yup, that reads fine to me too.

I don't think anyone has mentioned this yet, so I will:  library
writers using Decimal (or more generally HW 754 gimmicks) have a need
to fiddle lots of thread-local state ("numeric context"), and must
restore it no matter how the routine exits.  Like "boost precision to
twice the user's value over the next 12 computations, then restore",
and "no matter what happens here, restore the incoming value of the
overflow-happened flag".  It's just another instance of temporarily
taking over a shared resource, but I think it's worth mentioning that
there are a lot of things "like that" in the world, and to which
decorators don't really sanely apply.
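Tim's numeric-context example is exactly this save/fiddle/restore shape. A sketch using the `decimal` module (available since Python 2.4) hand-rolls the try/finally that a block template would factor out; the helper name is invented for illustration:

```python
import decimal

def with_extra_precision(extra, fn, *args):
    # Temporarily boost the thread-local decimal context's precision,
    # restoring it no matter how fn exits (return or exception).
    ctx = decimal.getcontext()
    saved = ctx.prec
    ctx.prec += extra
    try:
        return fn(*args)
    finally:
        ctx.prec = saved
```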

From jcarlson at uci.edu  Tue May  3 22:39:21 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Tue, 03 May 2005 13:39:21 -0700
Subject: [Python-Dev] Need to hook Py_FatalError
In-Reply-To: <Xns964BE20F692E9token@80.91.229.5>
References: <20050503112637.648F.JCARLSON@uci.edu>
	<Xns964BE20F692E9token@80.91.229.5>
Message-ID: <20050503132639.6492.JCARLSON@uci.edu>


"m.u.k" <m.u.k.2 at gawab.com> wrote:
> Josiah Carlson <jcarlson at uci.edu> wrote in
> news:20050503112637.648F.JCARLSON at uci.edu: 
> 
> > Offering any hook for Py_FatalError may not even be enough, as some of
> > those errors are caused by insufficient memory.  What if a hook were
> > available, but it couldn't be called because there wasn't enough memory?
> > 
> > Of course there is the option of pre-allocating a few kilobytes, then
> > just before one calls the hook, freeing that memory so that the hook can
> > execute (assuming the hook is small enough).  I'm not sure if this is a
> desirable general mechanic, but it may be sufficient for you.  If you
> > do figure out a logging mechanism that is almost guaranteed to execute
> > on FatalError, post it to sourceforge.
>  
> IMHO this should be left to the hook writer ("hooker" is apparently not the
> right word, but you get the point :) ). If he allocates more memory or does
> heavy stuff, that will just fail. Anyway, abort() is a failure too: either
> abort() will end the process, or the OS will on such a critical error.

I'm not talking about doing memory-intensive callbacks, I'm talking
about the function call itself.

From what I understand, any function call in Python requires a memory
allocation. This is trivially true in the case of reentrant Python
calls, which require the allocation of a frame object from heap memory,
and in the case of all calls, of C stack memory. If you cannot allocate
a frame for __del__ method calling (one of the error conditions), you
certainly aren't going to be able to call a Python callback (no heap
memory), and may not have enough stack memory for your logging
function, even if it is written in C (especially if you construct a
nontrivial portion of the message in memory before it is printed).

If I'm wrong, I'd like to hear it, but I'm still waiting for your patch
on sourceforge.
 - Josiah


From hpk at trillke.net  Tue May  3 23:26:36 2005
From: hpk at trillke.net (holger krekel)
Date: Tue, 3 May 2005 23:26:36 +0200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <ca471dc2050503132010abb4df@mail.gmail.com>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
	<ca471dc2050503132010abb4df@mail.gmail.com>
Message-ID: <20050503212636.GF30548@solar.trillke.net>

[Guido]
> [Holger]
> > However, i would find it much clearer if *defining* blocktemplates
> > used a new keyword, like:
> > 
> >     blocktemplate opening(filename, mode="r"):
> >         ...
> > 
> > because this immediately tells me what the purpose and semantics
> > of the following definition are.   The original overloading of 'def' to
> > mean generators if the body contains a yield statement was already a
> > matter of discussion (AFAIK).  When i came to Python it was at 2.2
> > and i remember wondering about this "def" oddity.
> > 
> > Extending poor old 'def' functions now to possibly mean block
> > templates gives me semantical overload even if it is justified
> > from an implementation point of view.  I am talking purely
> > about (my sense of) code readability here not about implementation.
> 
> Hm... Maybe you also want to have separate function and procedure
> keywords? Or static typing? 'def' can be used to define all sorts of
> things, that is Python's beauty!

Sure, 'def' is nice and i certainly wouldn't introduce 
a new keyword for adding e.g. static typing to function 'defs'.  

But for my taste, blocktemplates derive enough from the
old-style function/sub-routine notion that many people still
think of when seeing a 'def'.   When (new) people would see
something like 'blocktemplate ...:' they know they have to
look it up in the language documentation or in some book under
'blocktemplate' instead of trying to figure out (what the
hell) this "function" or "generator" does and how they can 
use it.  Or they might simply think they can invoke it from a 
for-loop which - as far as i understand - could lead to 
silent errors, no? 

Let me add that with the growing number of Python programmers
(as stated in your Pycon2005 keynote) it seems to make sense to
increase emphasis on how new syntax/concepts will be viewed/used
by possibly hundreds of thousands of programmers already
familiar with (some version of) Python. 

But i also see your point of confronting people with
the fact that Python has a nice unified 'def' statement 
so i guess it's a balancing act.  

cheers, 

    holger

From python at rcn.com  Tue May  3 23:30:23 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 3 May 2005 17:30:23 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc205050311481385b64f@mail.gmail.com>
Message-ID: <001901c55027$50c101e0$c704a044@oemcomputer>

> > In contrast, the new use of yield differs in that the suspended
frame
> > transfers control from the encloser to the enclosed.
> 
> Why does your notion of who encloses whom suddenly reverse when you go
> from a for-loop to a block-statement? This all feels very strange to
> me.

After another reading of the PEP, it seems fine.

On the earlier readings, the "yield" felt disorienting because the body
of the block is subordinate to the block-iterator yet its code is
co-located with the caller (albeit set-off with a colon and
indentation).



> I meant to use this as an example of the unsuitability of the
> @synchronized decorator, since it implies that all synchronization is
> on the same mutex, thereby providing a use case for the locking
> block-statement.
> 
> I suspect we're violently in agreement though.

Right :-)



> > This is the strongest example so far.  When adding it to the PEP, it
> > would be useful to contrast the code with simpler alternatives like
PEP
> > 288's g.throw() or PEP 325's g.close().  On the plus side, the
> > block-iterator approach factors out code common to multiple callers.
On
> > the minus side, the other PEPs involve simpler mechanisms and their
> > learning curve would be nearly zero.  These pluses and minuses are
> > important because they apply equally to all examples using blocks for
> > initialization/finalization.
> 
> Where do you see a learning curve for blocks?

Altering the meaning of a for-loop; introducing a new keyword; extending
the semantics of "break" and "continue"; allowing try/finally inside a
generator; introducing new control flow; adding new magic methods
__next__ and __exit__; adding a new context for "as"; and transforming
"yield" from statement semantics to expression semantics.  This isn't a
lightweight proposal and not one where we get transference of knowledge
from other languages (except for a few users of Ruby, Smalltalk, etc).

By comparison, g.throw() or g.close() are trivially simple approaches
to generator/iterator finalization.



In the section on the new for-loop specification, what is the purpose of
"arg"?  Can it be replaced with the constant None?

        itr = iter(EXPR1)
        brk = False
        while True:
            try:
                VAR1 = next(itr, None)
            except StopIteration:
                brk = True
                break
            BLOCK1
        if brk:
            BLOCK2



In "block expr as var", can "var" be any lvalue?
  
    block context() as inputfil, outputfil, errorfil:
          for i, line in enumerate(inputfil):
               if not checkformat(line):
                    print >> errorfil, line
               else:
                    print >> outputfil, secret_recipe(line)
               


In re-reading the examples, it occurred to me that the word "block"
already has meaning in the context of threading.Lock.acquire() which has
an optional blocking argument defaulting to 1.


In example 4, consider adding a comment that the "continue" has its
normal (non-extending) meaning.


The examples should demonstrate the operation of the extended form of
"continue", "break", and "return" in the body of the block.



Raymond

From eric.nieuwland at xs4all.nl  Tue May  3 23:44:39 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Tue, 3 May 2005 23:44:39 +0200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <ca471dc2050503113127f938b0@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
Message-ID: <37179cd212e38a8e041b0d39ee0160de@xs4all.nl>

Guido van Rossum wrote:
> [Skip Montanaro]
>> To the casual observer, this
>> looks like "break" should break out of the loop:
>>
>>     while True:
>>         block synchronized(v1):
>>             ...
>>             if v1.field:
>>                 break
>>         time.sleep(1)
>
> Without 'block' this would be written as try/finally. And my point is
> that people just don't write try/finally inside a while loop very
> often (I found *no* examples in the entire standard library).

Errr... Dutch example: Dining Philosophers (Dijkstra)

--eric


From bjourne at gmail.com  Tue May  3 23:54:45 2005
From: bjourne at gmail.com (=?ISO-8859-1?Q?BJ=F6rn_Lindqvist?=)
Date: Tue, 3 May 2005 23:54:45 +0200
Subject: [Python-Dev] PEP 340: Only for try/finally?
In-Reply-To: <ca471dc2050503113127f938b0@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
Message-ID: <740c3aec05050314544178f57f@mail.gmail.com>

> >     Guido> How many try/finally statements have you written inside a loop?
> >     Guido> In my experience this is extreeeemely rare. I found no
> >     Guido> occurrences in the standard library.
> 
> [Skip again]
> > How'd we start talking about try/finally?
> 
> Because it provides by far the dominant use case for 'block'. The
> block-statement is intended to replace many boilerplate uses of
> try/finally. In addition, it's also a coroutine invocation primitive.

Maybe I'm not understanding something, but why should "block" only be
for less boilerplate in try/finally's? I spent an hour grepping
through the standard library and there are indeed lots of use cases
for some blocks to replace try/finallys. There are opportunities for
block opening(file) and block locked(mutex) everywhere!

But why stop there? Lots of functions that take a callable as an
argument could be upgraded to use the new block syntax. Because it is
a cool way to do template method, isn't it?  Take wrapper() in
curses/wrapper.py for example. Why have it like this:
wrapper(curses_wrapped_main) when you can have it like this:

.block wrapper():
.    (main program stuff)
.    (...)

Or assertRaises in unittest.py, why call it like this:

self.assertRaises(TypeError, lambda: a*x)

When you can squash the lambda like this:

.block self.assertRaises(TypeError):
.    a*x

Or for another use case, in gl-code you often write glBegin()..
glDrawBlah().. glEnd(). Make it properly indented!:

.block glNowDraw():    # glBegin(); yield; glEnd()
.    glDrawBlah()

Make your own repeat-until loop:

.def until(cond):
.    while True:
.        yield None
.        if cond:
.            break
.block until(lambda: s == "quit"):
.    s = sys.stdin.readline()
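The repeat-until template above relies on the block machinery; without it, the same body-first loop can be written directly in plain Python (a sketch reading from an iterable rather than stdin, with an invented function name, for illustration):

```python
def read_until_quit(lines):
    # Plain-Python equivalent of the proposed "block until(...)":
    # the body always runs at least once, then the condition is tested.
    seen = []
    it = iter(lines)
    while True:
        s = next(it)
        seen.append(s)
        if s == "quit":
            break
    return seen
```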

It seems like the possibilities are endless. Maybe too endless?
Because this new feature is so similar to anonymous functions, but is
not quite anonymous functions, why not introduce anonymous
functions instead, which could do all the things blocks can, and more?
But as I said, maybe I'm misunderstanding something.

-- 
mvh Björn

From pje at telecommunity.com  Wed May  4 00:02:56 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 03 May 2005 18:02:56 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <001901c55027$50c101e0$c704a044@oemcomputer>
References: <ca471dc205050311481385b64f@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>

At 05:30 PM 5/3/05 -0400, Raymond Hettinger wrote:
>By comparison, g.throw() or g.close() are trivially simple approaches
>to generator/iterator finalization.

That reminds me of something; in PEP 333 I proposed use of a 'close()' 
attribute in anticipation of PEP 325, so that web applications implemented 
as generators could take advantage of resource cleanup.  Is there any 
chance that as part of PEP 340, 'close()' could translate to the same as 
'__exit__(StopIteration)'?  If not, modifying PEP 333 to support '__exit__' 
is going to be a bit of a pain, especially since there's code in the field 
now with that assumption.


From gvanrossum at gmail.com  Wed May  4 00:04:46 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 15:04:46 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <001901c55027$50c101e0$c704a044@oemcomputer>
References: <ca471dc205050311481385b64f@mail.gmail.com>
	<001901c55027$50c101e0$c704a044@oemcomputer>
Message-ID: <ca471dc2050503150460c82d28@mail.gmail.com>

[Guido]
> > Where do you see a learning curve for blocks?

[Raymond]
> Altering the meaning of a for-loop; introducing a new keyword; extending
> the semantics of "break" and "continue"; allowing try/finally inside a
> generator; introducing new control flow; adding new magic methods
> __next__ and __exit__; adding a new context for "as"; and transforming
> "yield" from statement semantics to expression semantics.  This isn't a
> lightweight proposal and not one where we get transference of knowledge
> from other languages (except for a few users of Ruby, Smalltalk, etc).

[Bah, gmail just lost my draft. :-( Trying to reconstruct...]

But there are several separable proposals in the PEP. Using "continue
EXPR", which calls itr.__next__(EXPR) and becomes the return value of
a yield-expression, is entirely orthogonal (and come to think of it the
PEP needs a motivating example for this).

And come to think of it, using a generator to "drive" a block
statement is also separable; with just the definition of the block
statement from the PEP you could implement all the examples using a
class (similar to example 6, which is easily turned into a template).

I think that seeing just two of the examples would be enough for most
people to figure out how to write their own, so that's not much of a
learning curve IMO.

> By comparison, g.throw() or g.close() are trivially simple approaches
> to generator/iterator finalization.

But much more clumsy to use since you have to write your own try/finally.

> In section on new for-loop specification, what is the purpose of "arg"?
> Can it be replaced with the constant None?

No, it is set by the "continue EXPR" translation given just below it.
I'll add a comment; other people also missed this.

> In "block expr as var", can "var" be any lvalue?

Yes. That's what I meant by "VAR1 is an arbitrary assignment target
(which may be a comma-separated list)". I'm adding an example that
shows this usage.

> In re-reading the examples, it occurred to me that the word "block"
> already has meaning in the context of threading.Lock.acquire() which has
> an optional blocking argument defaulting to 1.

Yeah, Holger also pointed out that block is a common variable name... :-(

> In example 4, consider adding a comment that the "continue" has its
> normal (non-extending) meaning.

I'd rather not, since this would just increase the confusion between
the body of the generator (where yield has a special meaning) vs. the
body of the block-statement (where continue, break, return and
exceptions have a special meaning). Also note example 5, which has a
yield inside a block-statement. This is the block statement's
equivalent to using a for-loop with a yield in its body in a regular
generator when it is invoking another iterator or generator
recursively.

> The examples should demonstrate the operation of the extended form of
> "continue", "break", and "return" in the body of the block.

Good point. (Although break and return don't really have an extended
form -- they just get new semantics in a block-statement.) I'll have
to think about those.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Wed May  4 00:11:40 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 03 May 2005 18:11:40 -0400
Subject: [Python-Dev] PEP 340: Only for try/finally?
In-Reply-To: <740c3aec05050314544178f57f@mail.gmail.com>
References: <ca471dc2050503113127f938b0@mail.gmail.com>
	<20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050503180546.03471d70@mail.telecommunity.com>

At 11:54 PM 5/3/05 +0200, BJörn Lindqvist wrote:
>It seems like the possibilities are endless. Maybe too endless?
>Because this new feature is so similar to anonymous functions, but is
>not quite anonymous functions, so why not introduce anonymous
>functions instead, that could make all the things block can, and more?
>But as I said, I'm misunderstanding something.

Anonymous functions can't rebind variables in their enclosing function.  It 
could be argued that it's better to fix this, rather than inventing a new 
macro-like facility, but I don't know how such a rebinding facility could 
preserve readability as well as PEP 340 does.

Also, many of your examples are indeed improvements over calling a function 
that takes a function.  The block syntax provides a guarantee that the 
block will be executed immediately or not at all.  Once you are past the 
block suite in the code, you know it will not be re-executed, because no 
reference to it is ever held by the called function.  You do not have this 
same guarantee when you see a function-taking-function being invoked.  So, 
a block suite tells you that the control flow is more-or-less linear, 
whereas a function definition raises the question of *when* that function 
will be executed, and whether you have exhaustive knowledge of the possible 
places from which it may be called.
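The contrast can be made concrete with a tiny sketch (invented names, for illustration): with a function-taking-function, nothing at the call site reveals when, or how many times, the "body" runs.

```python
def run_twice(callback):
    # The callee is free to invoke the body whenever, and however
    # many times, it likes -- or to stash a reference for later.
    # Nothing at the call site says so.
    callback()
    callback()

def logged(events):
    # The caller's "body" is the lambda; it happens to run twice here.
    run_twice(lambda: events.append("body ran"))
    return events
```

A block suite, by contrast, would guarantee the body executes inline, at most once, with no reference retained.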



From nidoizo at yahoo.com  Wed May  4 00:29:33 2005
From: nidoizo at yahoo.com (Nicolas Fleury)
Date: Tue, 03 May 2005 18:29:33 -0400
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <ca471dc2050503113127f938b0@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org>
	<427797D5.8030207@cirad.fr>	<17015.39213.522060.873605@montanaro.dyndns.org>	<ca471dc20505031013287a2e92@mail.gmail.com>	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
Message-ID: <d58tkb$fvp$1@sea.gmane.org>

Guido van Rossum wrote:
> [Skip Montanaro]
>>    Guido> How many try/finally statements have you written inside a loop?
>>    Guido> In my experience this is extreeeemely rare. I found no
>>    Guido> occurrences in the standard library.
> 
>>How'd we start talking about try/finally?
> 
> Because it provides by far the dominant use case for 'block'. The
> block-statement is intended to replace many boilerplace uses of
> try/finally. In addition, it's also a coroutine invocation primitive.

I would expect programmers to do more than merely replace existing 
try/finally blocks.  Support for RAII patterns in Python might result in 
more use of RAII primitives, and some may fit very well inside a loop. 
It might not be a bad idea to look at what other languages are doing 
with RAII.  Also, even if there is no occurrence right now in the 
standard library, that doesn't mean there never was one as the code 
evolved, and debugging such a pitfall would not be fun.

FWIW, I expect most generators used in block-syntax not to be loops. 
What would it imply to support these passing "break" to the parent loop 
at run-time?  Maybe generators are not the way to go; blocks could 
instead be supported natively by objects providing a __block__ function, 
very much like sequences providing an __iter__ function for for-loops.

We could avoid explaining to a newbie why the following code doesn't 
work if "opening" could be implemented in such a way that it works.

for filename in filenames:
     block opening(filename) as file:
         if someReason: break
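Sketch of what the block statement abbreviates here -- try/finally inside a
loop.  With an explicit try/finally, "break" reaches the enclosing for-loop
and the cleanup still runs; the question in this thread is whether a block
statement can behave the same way.  Resource below is an invented stand-in
for the PEP's opening().

```python
class Resource:
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

def process(items, stop_early):
    handled = []
    for item in items:
        r = Resource()         # stand-in for opening(filename)
        try:
            handled.append(item)
            if stop_early:
                break          # leaves the for-loop; finally still runs
        finally:
            r.close()
    return handled
```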

By the way, FWIW, my preference is to have no keyword, making it clearer 
that some block statements are loops and others are not, but probably 
amplifying the "break" problem.

Regards,
Nicolas


From gvanrossum at gmail.com  Wed May  4 00:33:37 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 15:33:37 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
References: <ca471dc205050311481385b64f@mail.gmail.com>
	<001901c55027$50c101e0$c704a044@oemcomputer>
	<5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
Message-ID: <ca471dc20505031533ab2da74@mail.gmail.com>

[Phillip]
> That reminds me of something; in PEP 333 I proposed use of a 'close()'
> attribute in anticipation of PEP 325, so that web applications implemented
> as generators could take advantage of resource cleanup.  Is there any
> chance that as part of PEP 340, 'close()' could translate to the same as
> '__exit__(StopIteration)'?  If not, modifying PEP 333 to support '__exit__'
> is going to be a bit of a pain, especially since there's code in the field
> now with that assumption.

Maybe if you drop support for the "separate protocol" alternative... :-)

I had never heard of that PEP. How much code is there in the field?
Written by whom?

I suppose you can always write a decorator that takes care of the
mapping. I suppose it should catch and ignore the StopIteration that
__exit__(StopIteration) is likely to throw.
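One possible shape of such an adapter (a wrapper class rather than a
literal decorator; all names here are invented): give an object that only
has the PEP-340-style __exit__() the close() that PEP 333 servers expect,
swallowing the StopIteration that __exit__(StopIteration) is likely to
throw.

```python
class CloseAdapter:
    def __init__(self, app_iter):
        self._it = app_iter

    def __iter__(self):
        return iter(self._it)

    def close(self):
        # Translate close() into __exit__(StopIteration), if available.
        exit = getattr(self._it, '__exit__', None)
        if exit is not None:
            try:
                exit(StopIteration)
            except StopIteration:
                pass            # expected: the iterator just finished
```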

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From nbastin at opnet.com  Wed May  4 00:36:23 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Tue, 3 May 2005 18:36:23 -0400
Subject: [Python-Dev] Py_UNICODE madness
Message-ID: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>

The documentation for Py_UNICODE states the following:

"This type represents a 16-bit unsigned storage type which is used by  
Python internally as basis for holding Unicode ordinals. On platforms 
where wchar_t is available and also has 16-bits,  Py_UNICODE is a 
typedef alias for wchar_t to enhance  native platform compatibility. On 
all other platforms,  Py_UNICODE is a typedef alias for unsigned 
short."

However, we have found this not to be true on at least certain RedHat 
versions (maybe all, but I'm not willing to say that at this point).  
pyconfig.h on these systems reports that PY_UNICODE_TYPE is wchar_t, 
and PY_UNICODE_SIZE is 4.  Needless to say, this isn't consistent with 
the docs.  It also creates quite a few problems when attempting to 
interface Python with other libraries which produce unicode data.

Is this a bug, or is this behaviour intended?

It turns out that at some point in the past, this created problems for 
tkinter as well, so someone just changed the internal unicode 
representation in tkinter to be 4 bytes as well, rather than tracking 
down the real source of the problem.

Is PY_UNICODE_TYPE always going to be guaranteed to be 16 bits, or is 
it dependent on your platform? (in which case we can give up now on 
Python unicode compatibility with any other libraries).  At the very 
least, if we can't guarantee the internal representation, then the 
PyUnicode_FromUnicode API needs to go away, and be replaced with 
something capable of transcoding various unicode inputs into the 
internal python representation.

--
Nick


From gvanrossum at gmail.com  Wed May  4 00:39:04 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 15:39:04 -0700
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <d58tkb$fvp$1@sea.gmane.org>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
Message-ID: <ca471dc205050315392e00f98c@mail.gmail.com>

> FWIW, I expect most generators used in block-syntax to not be loops.
> What would imply to support these to pass "break" to parent loop at
> run-time?

I proposed this at some point during the discussion leading up to the
PEP and it was boohed away as too fragile (and I agree). You're just
going to have to learn to deal with it, just as you can't break out of
two nested loops (but you can return from the innermost loop).

> Maybe generators are not the way to go, but could be
> supported natively by providing a __block__ function, very similarly to
> sequences providing an __iter__ function for for-loops?

Sorry, I have no idea what you are proposing here.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Wed May  4 00:44:13 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 15:44:13 -0700
Subject: [Python-Dev] Py_UNICODE madness
In-Reply-To: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>
References: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>
Message-ID: <ca471dc20505031544373d8c14@mail.gmail.com>

I think that documentation is wrong; AFAIK Py_UNICODE has always been
allowed to be either 16 or 32 bits, and the source code goes to great
lengths to make sure that you get a link error if you try to
combine extensions built with different assumptions about its size.
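Not a fix, but a quick way to see which width a given interpreter was
actually built with, from Python rather than the C API: sys.maxunicode is
0xFFFF on 16-bit (UCS-2) builds and 0x10FFFF on 32-bit (UCS-4) builds such
as the Red Hat ones discussed here.

```python
import sys

def py_unicode_width():
    # 2 bytes on narrow (UCS-2) builds, 4 bytes on wide (UCS-4) builds
    return 2 if sys.maxunicode == 0xFFFF else 4
```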

On 5/3/05, Nicholas Bastin <nbastin at opnet.com> wrote:
> The documentation for Py_UNICODE states the following:
> 
> "This type represents a 16-bit unsigned storage type which is used by
> Python internally as basis for holding Unicode ordinals. On platforms
> where wchar_t is available and also has 16-bits,  Py_UNICODE is a
> typedef alias for wchar_t to enhance  native platform compatibility. On
> all other platforms,  Py_UNICODE is a typedef alias for unsigned
> short."
> 
> However, we have found this not to be true on at least certain RedHat
> versions (maybe all, but I'm not willing to say that at this point).
> pyconfig.h on these systems reports that PY_UNICODE_TYPE is wchar_t,
> and PY_UNICODE_SIZE is 4.  Needless to say, this isn't consistent with
> the docs.  It also creates quite a few problems when attempting to
> interface Python with other libraries which produce unicode data.
> 
> Is this a bug, or is this behaviour intended?
> 
> It turns out that at some point in the past, this created problems for
> tkinter as well, so someone just changed the internal unicode
> representation in tkinter to be 4 bytes as well, rather than tracking
> down the real source of the problem.
> 
> Is PY_UNICODE_TYPE always going to be guaranteed to be 16 bits, or is
> it dependent on your platform? (in which case we can give up now on
> Python unicode compatibility with any other libraries).  At the very
> least, if we can't guarantee the internal representation, then the
> PyUnicode_FromUnicode API needs to go away, and be replaced with
> something capable of transcoding various unicode inputs into the
> internal python representation.
> 
> --
> Nick
> 
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: http://mail.python.org/mailman/options/python-dev/guido%40python.org
> 


-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From jimjjewett at gmail.com  Wed May  4 00:56:50 2005
From: jimjjewett at gmail.com (Jim Jewett)
Date: Tue, 3 May 2005 18:56:50 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc20505031240c7ffc2b@mail.gmail.com>
References: <fb6fbf560505031207e6e8ff@mail.gmail.com>
	<ca471dc20505031240c7ffc2b@mail.gmail.com>
Message-ID: <fb6fbf560505031556a5e8121@mail.gmail.com>

Summary:  

Resource Managers are a good idea.
First Class Suites may be a good idea.

Block Iterators try to split the difference.  They're not as powerful
as First Class Suites, and not as straightforward as Resource 
Managers.  This particular middle ground didn't work out so well.

On 5/3/05, Guido van Rossum <gvanrossum at gmail.com> wrote:
> [Jim Jewett]
...
> > With the block, every yield returns to a single designated callback.
> > This callback had to be established at the same time the block was
> > created, and must be textually inside it.  (An indented suite to the
> > "block XXX:" line.)
 
> Doesn't convince me. The common use for a regular generator is in a
> for-loop, where every yield also returns to a single designated place
> (calling it callback is really deceptive!).

I do not consider the body of a for-loop to be a callback; the generator
has no knowledge of that body.

But with a Block Iterator, the generator (or rather, its unrolled version) 
does need to textually contain the to-be-included suite -- which is why 
that suite smells like a callback function that just doesn't happen to be 
named.
 
> And with a block, you're free to put the generator call ahead of the
> block so you can call next() on it manually:
> 
>     it = EXPR1
>     block it:
>         BLOCK1

> ... lets you call next() on it as you please (until the
> block is exited, for sure).

For a Resource Manager, the only thing this could do is effectively
discard the BLOCK1, because the yields would have been used
up (and the resource deallocated).  

I suppose this is another spelling of "resources are not loops".

> > But are there plenty of other use cases for PEP 340?
 
> Yes. Patterns like "do this little dance in a try/finally block" and
> "perform this tune when you catch an XYZ exception" are pretty common

...

Let me rephrase ...

The Block Iterator syntax gets awkward if it needs to yield more than
once (and the exits are not interchangeable).  You have said that is OK 
because most Resource Managers only yield once.

But if you're willing to accept that, then why not just limit it to a Resource
Manager instead of an Iterator?  Resource Managers could look similar 
to the current proposal, but would be less ambitious.  They should have 
absolutely no connection to loops/iterators/generators.  There should be
no internal secret loop.  If they use the "yield" keyword, it should be 
described as "yielding control" rather than "yielding the next value."  There 
would be only one yielding of control per Resource Manager.

If limiting the concept to Resource Managers is not acceptable, then
I still don't think Block Iterators are the right answer -- though First Class
Suites might be.  (And so might "No Changes at all".)

Reasoning:

If there is only one yield, then you're really just wrapping the call to 
the (unnamed) suite.

(Q)    Why are decorators not appropriate? 

(A1)   In some cases, the wrapper needs to capture an 
instance-variable, which isn't available at definition-time.
(A2)   Decorators can be ugly.  This is often because the
need to return a complete replacement callable leads to too 
many nested functions.

These are both problems with decorators.  They do argue for
improving the decorator syntax, but not for throwing out the
concept.  I don't think that Block Iterators will really clear things 
up -- to me, they just look like a different variety of fog.

If decoration doesn't work, why not use a regular function
that takes a callback?  Pass the callback instead of defining an
anonymous suite.  Call the callback instead of writing the single
yield.
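Jim's alternative, spelled out (the names are invented): an ordinary
function taking a callback, with the single "yield of control" replaced by
calling the callback.

```python
import threading

def locked(lock, callback):
    lock.acquire()
    try:
        return callback()      # the "anonymous suite" runs here
    finally:
        lock.release()

my_lock = threading.Lock()
result = locked(my_lock, lambda: "did work while holding the lock")
```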

...

> ... you are proposing to solve all its use cases by defining an
> explicit function or method representing the body of the block. 

Yes.

> The latter solution leads to way too much ugly code -- all that
> function-definition boilerplate is worse than the try/finally 
> boilerplate we're trying to hide!

In the cases I've actually seen, the ugly function definition portions
are in the decorator, rather than the regular function.  It trades a
little ugliness that gets repeated all over the place for a lot of ugliness
that happens only once (in the decorator).

That said, I'm willing to believe that breaking out a method might 
sometimes be a bad idea.  In which case you probably want 
First Class (and decorable) Suites.

If First Class Suites are not acceptable in general, then let's figure
out where they are acceptable.  For me, Resource Manager is a good
use case, but Block Iterator is not.

-jJ

From gvanrossum at gmail.com  Wed May  4 01:08:29 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 16:08:29 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <fb6fbf560505031556a5e8121@mail.gmail.com>
References: <fb6fbf560505031207e6e8ff@mail.gmail.com>
	<ca471dc20505031240c7ffc2b@mail.gmail.com>
	<fb6fbf560505031556a5e8121@mail.gmail.com>
Message-ID: <ca471dc205050316084f2ed151@mail.gmail.com>

Sorry Jim, but I just don't think you & I were intended to be on the
same language design committee. Nothing you say seems to be making any
sense to me these days. Maybe someone else can channel you
effectively, but I'm not going to try to do a line-by-line response to
your email quoted below.

On 5/3/05, Jim Jewett <jimjjewett at gmail.com> wrote:
> Summary:
> 
> Resource Managers are a good idea.
> First Class Suites may be a good idea.
> 
> Block Iterators try to split the difference.  They're not as powerful
> as First Class Suites, and not as straightforward as Resource
> Managers.  This particular middle ground didn't work out so well.
> 
> On 5/3/05, Guido van Rossum <gvanrossum at gmail.com> wrote:
> > [Jim Jewett]
> ...
> > > With the block, every yield returns to a single designated callback.
> > > This callback had to be established at the same time the block was
> > > created, and must be textually inside it.  (An indented suite to the
> > > "block XXX:" line.)
> 
> > Doesn't convince me. The common use for a regular generator is in a
> > for-loop, where every yield also returns to a single designated place
> > (calling it callback is really deceptive!).
> 
> I do not consider the body of a for-loop to be a callback; the generator
> has no knowledge of that body.
> 
> But with a Block Iterator, the generator (or rather, its unrolled version)
> does need to textually contain the to-be-included suite -- which is why
> that suite smells like a callback function that just doesn't happen to be
> named.
> 
> > And with a block, you're free to put the generator call ahead of the
> > block so you can call next() on it manually:
> >
> >     it = EXPR1
> >     block it:
> >         BLOCK1
> 
> > ... lets you call next() on it as you please (until the
> > block is exited, for sure).
> 
> For a Resource Manager, the only thing this could do is effectively
> discard the BLOCK1, because the yields would have been used
> up (and the resource deallocated).
> 
> I suppose this is another spelling of "resources are not loops".
> 
> > > But are there plenty of other use cases for PEP 340?
> 
> > Yes. Patterns like "do this little dance in a try/finally block" and
> > "perform this tune when you catch an XYZ exception" are pretty common
> 
> ...
> 
> Let me rephrase ...
> 
> The Block Iterator syntax gets awkward if it needs to yield more than
> once (and the exits are not interchangeable).  You have said that is OK
> because most Resource Managers only yield once.
> 
> But if you're willing to accept that, then why not just limit it to a Resource
> Manager instead of an Iterator?  Resource Managers could look similar
> to the current proposal, but would be less ambitious.  They should have
> absolutely no connection to loops/iterators/generators.  There should be
> no internal secret loop.  If they use the "yield" keyword, it should be
> described as "yielding control" rather than "yielding the next value."  There
> would be only one yielding of control per Resource Manager.
> 
> If limiting the concept to Resource Managers is not acceptable, then
> I still don't think Block Iterators are the right answer -- though First Class
> Suites might be.  (And so might "No Changes at all".)
> 
> Reasoning:
> 
> If there is only one yield, then you're really just wrapping the call to
> the (unnamed) suite.
> 
> (Q)    Why are decorators not appropriate?
> 
> (A1)   In some cases, the wrapper needs to capture an
> instance-variable, which isn't available at definition-time.
> (A2)   Decorators can be ugly.  This is often because the
> need to return a complete replacement callable leads to too
> many nested functions.
> 
> These are both problems with decorators.  They do argue for
> improving the decorator syntax, but not for throwing out the
> concept.  I don't think that Block Iterators will really clear things
> up -- to me, they just look like a different variety of fog.
> 
> If decoration doesn't work, why not use a regular function
> that takes a callback?  Pass the callback instead of defining an
> anonymous suite.  Call the callback instead of writing the single
> yield.
> 
> ...
> 
> > ... you are proposing to solve all its use cases by defining an
> > explicit function or method representing the body of the block.
> 
> Yes.
> 
> > The latter solution leads to way too much ugly code -- all that
> > function-definition boilerplate is worse than the try/finally
> > boilerplate we're trying to hide!
> 
> In the cases I've actually seen, the ugly function definition portions
> are in the decorator, rather than the regular function.  It trades a
> little ugliness that gets repeated all over the place for a lot of ugliness
> that happens only once (in the decorator).
> 
> That said, I'm willing to believe that breaking out a method might
> sometimes be a bad idea.  In which case you probably want
> First Class (and decorable) Suites.
> 
> If First Class Suites are not acceptable in general, then let's figure
> out where they are acceptable.  For me, Resource Manager is a good
> use case, but Block Iterator is not.
> 
> -jJ
> 


-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Wed May  4 01:27:42 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 03 May 2005 19:27:42 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc20505031533ab2da74@mail.gmail.com>
References: <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
	<ca471dc205050311481385b64f@mail.gmail.com>
	<001901c55027$50c101e0$c704a044@oemcomputer>
	<5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com>

At 03:33 PM 5/3/05 -0700, Guido van Rossum wrote:
>[Phillip]
> > That reminds me of something; in PEP 333 I proposed use of a 'close()'
> > attribute in anticipation of PEP 325, so that web applications implemented
> > as generators could take advantage of resource cleanup.  Is there any
> > chance that as part of PEP 340, 'close()' could translate to the same as
> > '__exit__(StopIteration)'?  If not, modifying PEP 333 to support '__exit__'
> > is going to be a bit of a pain, especially since there's code in the field
> > now with that assumption.
>
>Maybe if you drop support for the "separate protocol" alternative... :-)

I don't understand you.  Are you suggesting a horse trade, or...?


>I had never heard of that PEP. How much code is there in the field?

Maybe a dozen or so web applications and frameworks (including Zope, 
Quixote, PyBlosxom) and maybe a half dozen servers (incl. Twisted and 
mod_python).  A lot of the servers are based on my wsgiref library, though, 
so it probably wouldn't be too horrible a job to make everybody add 
support; I might even be able to fudge wsgiref so that wsgiref-based 
servers don't even see an issue.

Modifying the spec is potentially more controversial, however; it'll have 
to go past the Web-SIG, and I assume the first thing that'll be asked is, 
"Why aren't generators getting a close() method then?", so I figured I 
should ask that question first.

I'd completely forgotten about this being an issue until Raymond mentioned 
g.close(); I'd previously gotten the impression that PEP 325 was expected 
to be approved, otherwise I wouldn't have written support for it into PEP 333.


>Written by whom?

I used to know who all had written implementations, but there are now too 
many to keep track of.


From pje at telecommunity.com  Wed May  4 01:36:47 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 03 May 2005 19:36:47 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com>
References: <ca471dc20505031533ab2da74@mail.gmail.com>
	<5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
	<ca471dc205050311481385b64f@mail.gmail.com>
	<001901c55027$50c101e0$c704a044@oemcomputer>
	<5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050503193348.031096f0@mail.telecommunity.com>

At 07:27 PM 5/3/05 -0400, Phillip J. Eby wrote:
>Modifying the spec is potentially more controversial, however; it'll have
>to go past the Web-SIG, and I assume the first thing that'll be asked is,
>"Why aren't generators getting a close() method then?", so I figured I
>should ask that question first.

You know what, never mind.  I'm still going to write the Web-SIG so they 
know the change is coming, but this is really a very minor thing; just a 
feature we won't get "for free" as a side effect of PEP 325.

Your decorator idea is a trivial solution, but it would also be trivial to 
allow WSGI server implementations to call __exit__ on generators.  None of 
this affects existing code in the field, because today you can't write a 
try/finally in a generator anyway.  Therefore, nobody is relying on this 
feature, therefore it's basically moot.


From gvanrossum at gmail.com  Wed May  4 01:41:53 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 16:41:53 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com>
References: <ca471dc205050311481385b64f@mail.gmail.com>
	<001901c55027$50c101e0$c704a044@oemcomputer>
	<5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
	<ca471dc20505031533ab2da74@mail.gmail.com>
	<5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com>
Message-ID: <ca471dc2050503164127505290@mail.gmail.com>

> >Maybe if you drop support for the "separate protocol" alternative... :-)
> 
> I don't understand you.  Are you suggesting a horse trade, or...?

Only tongue-in-cheek. :-)

> >I had never heard of that PEP. How much code is there in the field?
> 
> Maybe a dozen or so web applications and frameworks (including Zope,
> Quixote, PyBlosxom) and maybe a half dozen servers (incl. Twisted and
> mod_python).  A lot of the servers are based on my wsgiref library, though,
> so it probably wouldn't be too horrible a job to make everybody add
> support; I might even be able to fudge wsgiref so that wsgiref-based
> servers don't even see an issue.
> 
> Modifying the spec is potentially more controversial, however; it'll have
> to go past the Web-SIG, and I assume the first thing that'll be asked is,
> "Why aren't generators getting a close() method then?", so I figured I
> should ask that question first.
> 
> I'd completely forgotten about this being an issue until Raymond mentioned
> g.close(); I'd previously gotten the impression that PEP 325 was expected
> to be approved, otherwise I wouldn't have written support for it into PEP 333.
> 
> >Written by whom?
> 
> I used to know who all had written implementations, but there are now too
> many to keep track of.

Given all that, it's not infeasible to add a close() method to
generators as a shortcut for this:

    def close(self):
        try:
            self.__exit__(StopIteration)
        except StopIteration:
            pass
        else:
            # __exit__() didn't raise StopIteration
            raise RuntimeError("or some other exception")

I'd like the block statement to be defined exclusively in terms of
__exit__() though.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tdelaney at avaya.com  Wed May  4 01:45:38 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Wed, 4 May 2005 09:45:38 +1000
Subject: [Python-Dev] PEP 340 -- loose ends
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE025204D9@au3010avexu1.global.avaya.com>

Another loose end (which can partially explain why I still thought
__next__ took an exception ;)

In "Specification: Generator Exit Handling"::

    "When __next__() is called with an argument that is not None, the
    yield-expression that it resumes will return the value attribute
    of the argument."

I think this should read::

    "When __next__() is called with an argument that is not None, the
    yield-expression that it resumes will return the argument."
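This is in fact how yield-expressions ended up working when generator
send() was later added (PEP 342, Python 2.5): the value passed in becomes
the value of the resumed yield-expression itself.

```python
def echo():
    received = []
    while True:
        value = yield received   # send(x) makes this yield return x
        received.append(value)

g = echo()
next(g)             # advance to the first yield
g.send('a')
out = g.send('b')   # out is ['a', 'b']
```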

Tim Delaney

From python at jwp.name  Wed May  4 01:54:27 2005
From: python at jwp.name (James William Pye)
Date: Tue, 03 May 2005 16:54:27 -0700
Subject: [Python-Dev] Need to hook Py_FatalError
In-Reply-To: <20050503175422.GF8344@unpythonic.net>
References: <Xns964B980C943F1token@80.91.229.5>
	<ca471dc205050309156962d3ff@mail.gmail.com>
	<20050503175422.GF8344@unpythonic.net>
Message-ID: <1115164467.62180.17.camel@localhost>

On Tue, 2005-05-03 at 12:54 -0500, Jeff Epler wrote:
> On Tue, May 03, 2005 at 09:15:42AM -0700, Guido van Rossum wrote:
> > But tell me, what do you want the process to do instead of
> > terminating? Py_FatalError is used in situations where raising an
> > exception is impossible or would do more harm than good.
> 
> In an application which embeds Python, I want to show the application's
> standard error dialog, which doesn't call any Python APIs (but does do
> things like capture the call stack at the time of the error).  For this
> use, it doesn't matter that no further calls to those APIs are possible.
> 
> Jeff

+1 Here.

In my case (PostgreSQL), it would probably be wiser to map Py_FatalError
calls to Postgres' ereport(FATAL, (...)), as that does appear to do some
cleanup on exit, and if there's a remote user, it could actually deliver
the message to the user.

[http://python.project.postgresql.org]
-- 
Regards, James William Pye

From python at rcn.com  Wed May  4 01:57:02 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 3 May 2005 19:57:02 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc2050503150460c82d28@mail.gmail.com>
Message-ID: <000601c5503b$cdaa26a0$41b6958d@oemcomputer>

> But there are several separable proposals in the PEP. Using "continue
> EXPR" which calls its.__next__(EXPR) which becomes the return value of
> a yield-expression is entirely orthogonal (and come to think of it the
> PEP needs a motivating example for this).
> 
> And come to think of it, using a generator to "drive" a block
> statement is also separable; with just the definition of the block
> statement from the PEP you could implement all the examples using a
> class (similar to example 6, which is easily turned into a template).

I think that realization is important.  It would be great to have a
section of the PEP that focuses on separability and matching features to
benefits.  Start with above observation that the proposed examples can
be achieved with generators driving the block statement.

When the discussion hits comp.lang.python, a separability section will
help focus the conversation (there's a flaw/issue/dislike about feature
x; however, features y/z and related benefits do not depend on x).

Essentially, having generators as block drivers is the base proposal.
Everything else is an elaboration.
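A sketch of that observation (all names invented): a plain class with a
next()/StopIteration protocol can drive the same pattern as a generator,
here driven by hand the way the block statement would drive it.

```python
class OneShot:
    """Yields control exactly once; cleanup runs on the second next()."""
    def __init__(self, log):
        self.log = log
        self.done = False

    def next(self):
        if self.done:
            self.log.append('release')
            raise StopIteration
        self.done = True
        self.log.append('acquire')

log = []
it = OneShot(log)
# Hand expansion of "block it: BLOCK1":
try:
    while True:
        it.next()              # acquire on first call, release on second
        log.append('BLOCK1')   # the block's body runs here
except StopIteration:
    pass
```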



> > By comparison, g.throw() or g.close() are trivially simple
> > approaches
> > to generator/iterator finalization.
> 
> But much more clumsy to use since you have to write your own
> try/finally.

Sometimes easy makes up for clumsy.



> > In re-reading the examples, it occurred to me that the word "block"
> > already has meaning in the context of threading.Lock.acquire(), which
> > has an optional blocking argument defaulting to 1.
> 
> Yeah, Holger also pointed out that block is a common variable name...
> :-(

Someone mentioned "suite" as a suitable alternative.  That word seems to
encompass the same conceptual space without encroaching on existing
variable and argument names.  

Also, "suite" reads as a noun.  In contrast, "block" has a verb form
that too easily misreads as part of the block-iterator expression --
consider what comes to mind when you see block sender() or block
next_message().



Performance-wise, I cringe at the thought of adding any weight at all to
the for-loop semantics.  The current version is super lightweight and
clean.  Adding anything to it will likely have a comparatively strong
negative effect on timings.  It's too early for that discussion, but
keep it in mind.


That's pretty much it for my first readings of the PEP.  All-in-all it
has come together nicely.



Raymond

From python at rcn.com  Wed May  4 01:59:31 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 3 May 2005 19:59:31 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc2050503164127505290@mail.gmail.com>
Message-ID: <000701c5503c$265212e0$41b6958d@oemcomputer>

> it's not infeasible to add a close() method to
> generators as a shortcut for this:
> 
>     def close(self):
>         try:
>             self.__exit__(StopIteration)
>         except StopIteration:
>             pass
>         else:
>             # __exit__() didn't raise StopIteration
>             raise RuntimeError("or some other exception")
> 
> I'd like the block statement to be defined exclusively in terms of
> __exit__() though.

That sounds like a winner.



Raymond

From pje at telecommunity.com  Wed May  4 02:05:23 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 03 May 2005 20:05:23 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc2050503164127505290@mail.gmail.com>
References: <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com>
	<ca471dc205050311481385b64f@mail.gmail.com>
	<001901c55027$50c101e0$c704a044@oemcomputer>
	<5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
	<ca471dc20505031533ab2da74@mail.gmail.com>
	<5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com>

At 04:41 PM 5/3/05 -0700, Guido van Rossum wrote:
>Given all that, it's not infeasible to add a close() method to
>generators as a shortcut for this:
>
>     def close(self):
>         try:
>             self.__exit__(StopIteration)
>         except StopIteration:
>             pass
>         else:
>             # __exit__() didn't raise StopIteration
>             raise RuntimeError("or some other exception")
>
>I'd like the block statement to be defined exclusively in terms of
>__exit__() though.

Sure.  PEP 325 proposes a "CloseGenerator" exception in place of 
"StopIteration", however, because:

     """
     Issues: should StopIteration be reused for this purpose?  Probably
     not.  We would like close to be a harmless operation for legacy
     generators, which could contain code catching StopIteration to
     deal with other generators/iterators.
     """

I don't know enough about the issue to offer either support or opposition 
for this idea, though.


From gvanrossum at gmail.com  Wed May  4 02:07:39 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 17:07:39 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <000601c5503b$cdaa26a0$41b6958d@oemcomputer>
References: <ca471dc2050503150460c82d28@mail.gmail.com>
	<000601c5503b$cdaa26a0$41b6958d@oemcomputer>
Message-ID: <ca471dc2050503170748e63430@mail.gmail.com>

[Guido]
> > And come to think of it, using a generator to "drive" a block
> > statement is also separable; with just the definition of the block
> > statement from the PEP you could implement all the examples using a
> > class (similar to example 6, which is easily turned into a template).

[Raymond Hettinger]
> I think that realization is important.  It would be great to have a
> section of the PEP that focuses on separability and matching features to
> benefits.  Start with above observation that the proposed examples can
> be achieved with generators driving the block statement.

Good idea. I'm kind of stuck for time (have used up most of my Python
time for the next few weeks) -- if you or someone else could volunteer
some text I'd appreciate it.

> When the discussion hits comp.lang.python, a separability section will
> help focus the conversation (there's a flaw/issue/dislike about feature
> x; however, features y/z and related benefits do not depend on x).

Right. The PEP started with me not worrying too much about motivation
or use cases but instead focusing on precise specification of the
mechanisms, since there was a lot of confusion over that. Now that's
out of the way, motivation (you might call it "spin" :-) becomes more
important.

> Essentially, having generators as block drivers is the base proposal.
> Everything else is an elaboration.

Right.

> Someone mentioned "suite" as a suitable alternative.  That word seems to
> encompass the same conceptual space without encroaching on existing
> variable and argument names.

Alas, the word "suite" is used extensively when describing Python's syntax.

> Performance-wise, I cringe at the thought of adding any weight at all to
> the for-loop semantics.  The current version is super lightweight and
> clean.  Adding anything to it will likely have a comparatively strong
> negative effect on timings.  It's too early for that discussion, but
> keep it in mind.

A for-loop without a "continue EXPR" in it shouldn't need to change at
all; the tp_iternext slot could be filled with either __next__ or next
whichever is defined.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Wed May  4 02:17:46 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 17:17:46 -0700
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com>
References: <ca471dc205050311481385b64f@mail.gmail.com>
	<001901c55027$50c101e0$c704a044@oemcomputer>
	<5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
	<ca471dc20505031533ab2da74@mail.gmail.com>
	<5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com>
	<ca471dc2050503164127505290@mail.gmail.com>
	<5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com>
Message-ID: <ca471dc205050317175dac1c9f@mail.gmail.com>

On 5/3/05, Phillip J. Eby <pje at telecommunity.com> wrote:
> At 04:41 PM 5/3/05 -0700, Guido van Rossum wrote:
> >Given all that, it's not infeasible to add a close() method to
> >generators as a shortcut for this:
> >
> >     def close(self):
> >         try:
> >             self.__exit__(StopIteration)
> >         except StopIteration:
> >             pass
> >         else:
> >             # __exit__() didn't raise StopIteration
> >             raise RuntimeError("or some other exception")
> >
> >I'd like the block statement to be defined exclusively in terms of
> >__exit__() though.

(So do you want this feature now or not? Earlier you said it was no big deal.)

> Sure.  PEP 325 proposes a "CloseGenerator" exception in place of
> "StopIteration", however, because:
> 
>      """
>      Issues: should StopIteration be reused for this purpose?  Probably
>      not.  We would like close to be a harmless operation for legacy
>      generators, which could contain code catching StopIteration to
>      deal with other generators/iterators.
>      """
> 
> I don't know enough about the issue to offer either support or opposition
> for this idea, though.

That would be an issue for the generator finalization proposed by the
PEP as well.

But I kind of doubt that it's an issue; you'd have to have a
try/except catching StopIteration around a yield statement that
resumes the generator before this becomes an issue, and that sounds
extremely improbable. If at all possible I'd rather not have to define
a new exception for this purpose.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tdelaney at avaya.com  Wed May  4 02:31:18 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Wed, 4 May 2005 10:31:18 +1000
Subject: [Python-Dev] PEP 340 -- concept clarification
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721278@au3010avexu1.global.avaya.com>

Guido van Rossum wrote:

> I'd like the block statement to be defined exclusively in terms of
> __exit__() though.

This does actually suggest something to me (note - just a thought - no
real idea if it's got any merit).

Are there any use cases proposed for the block-statement (excluding the
for-loop) that do *not* involve resource cleanup (i.e. need an
__exit__)?

This could be the distinguishing feature between for-loops and
block-statements:

1. If an iterator declares __exit__, it cannot be used in a for-loop.
   For-loops do not guarantee resource cleanup.

2. If an iterator does not declare __exit__, it cannot be used in a
block-statement.
   Block-statements guarantee resource cleanup.

This gives separation of API (and thus purpose) whilst maintaining the
simplicity of the concept. Unfortunately, generators then become a pain
:( We would need additional syntax to declare that a generator was a
block generator.

OTOH, this may not be such a problem. Any generator that contains a
finally: around a yield automatically gets an __exit__, and any that
doesn't, doesn't. Although that feels *way* too magical to me (esp. in
light of my example below, which *doesn't* use finally). I'd prefer a
separate keyword for block generators. In that case, having finally:
around a yield would be a syntax error in a "normal" generator.

::

    resource locking(lock):
        lock.acquire()
        try:
            yield
        finally:
            lock.release()

    block locking(myLock):
        # Code here executes with myLock held.  The lock is
        # guaranteed to be released when the block is left (even
        # if via return or by an uncaught exception).

To use a (modified) example from another email::

    class TestCase:

        resource assertRaises (self, excClass):
            try:
                yield
            except excClass:
                return
            else:
                if hasattr(excClass, '__name__'):
                    excName = excClass.__name__
                else:
                    excName = str(excClass)
                raise self.failureException, "%s is not raised" % excName

    block self.assertRaises(TypeError):
        raise TypeError

Note that this *does* require cleanup, but without using a finally:
clause - the except: and else: are the cleanup code.

Tim Delaney

From python at rcn.com  Wed May  4 02:37:38 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 3 May 2005 20:37:38 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc2050503170748e63430@mail.gmail.com>
Message-ID: <000901c55041$7a06f5e0$41b6958d@oemcomputer>

> > I think that realization is important.  It would be great to have a
> > section of the PEP that focuses on separability and matching
> > features to benefits.  Start with above observation that the
> > proposed examples can be achieved with generators driving the block
> > statement.
> 
> Good idea. I'm kind of stuck for time (have used up most of my Python
> time for the next few weeks) -- if you or someone else could volunteer
> some text I'd appreciate it.

I'll take a crack at it in the morning (we all seem to be on borrowed
time this week).



> > When the discussion hits comp.lang.python, a separability section
> > will help focus the conversation (there's a flaw/issue/dislike about
> > feature x; however, features y/z and related benefits do not depend
> > on x).
> 
> Right. The PEP started with me not worrying too much about motivation
> or use cases but instead focusing on precise specification of the
> mechanisms, since there was a lot of confusion over that. Now that's
> out of the way, motivation (you might call it "spin" :-) becomes more
> important.

Perhaps the cover announcement should impart the initial spin as a
request for the community to create, explore, and learn from use cases.
That will help make the discussion more constructive, less abstract, and
more grounded in reality (wishful thinking).

That probably beats, "Here's 3500 words of proposal; do you like it?".



Raymond

From pje at telecommunity.com  Wed May  4 02:47:41 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 03 May 2005 20:47:41 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <ca471dc205050317175dac1c9f@mail.gmail.com>
References: <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com>
	<ca471dc205050311481385b64f@mail.gmail.com>
	<001901c55027$50c101e0$c704a044@oemcomputer>
	<5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
	<ca471dc20505031533ab2da74@mail.gmail.com>
	<5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com>
	<ca471dc2050503164127505290@mail.gmail.com>
	<5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050503202653.02fa28f0@mail.telecommunity.com>

At 05:17 PM 5/3/05 -0700, Guido van Rossum wrote:
>(So do you want this feature now or not? Earlier you said it was no big deal.)

It *isn't* a big deal; but it'd still be nice, and I'd happily volunteer to 
do the actual implementation of the 'close()' method myself, because it's 
about the same amount of work as updating PEP 333 and sorting out any 
political issues that might arise therefrom.  :)


>But I kind of doubt that it's an issue; you'd have to have a
>try/except catching StopIteration around a yield statement that
>resumes the generator before this becomes an issue, and that sounds
>extremely improbable.

But it does exist, alas; see the 'itergroup()' and 'xmap()' functions of 
this cookbook recipe:

     http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/66448/

Or more pointedly, the 'roundrobin()' example in the Python 2.4 documentation:

     http://www.python.org/doc/lib/deque-recipes.html
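For reference, the roundrobin() recipe cited here has exactly the shape
in question: an except StopIteration guarding the resumption of
iterators inside a loop. A modernized sketch (not the verbatim 2.4
documentation code):

```python
from collections import deque

def roundrobin(*iterables):
    # Cycle through the iterators, dropping each one as it is exhausted.
    pending = deque(iter(it) for it in iterables)
    while pending:
        task = pending.popleft()
        try:
            item = next(task)
        except StopIteration:
            continue  # this iterator is done; don't requeue it
        pending.append(task)
        yield item

print(list(roundrobin('abc', 'd', 'ef')))  # ['a', 'd', 'e', 'b', 'f', 'c']
```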

And there are other examples as well:

     http://www.faqts.com/knowledge_base/view.phtml/aid/13516
     http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/141934



From pje at telecommunity.com  Wed May  4 02:55:32 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 03 May 2005 20:55:32 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <5.1.1.6.0.20050503202653.02fa28f0@mail.telecommunity.com>
References: <ca471dc205050317175dac1c9f@mail.gmail.com>
	<5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com>
	<ca471dc205050311481385b64f@mail.gmail.com>
	<001901c55027$50c101e0$c704a044@oemcomputer>
	<5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>
	<ca471dc20505031533ab2da74@mail.gmail.com>
	<5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com>
	<ca471dc2050503164127505290@mail.gmail.com>
	<5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050503205342.03cc4250@mail.telecommunity.com>

At 08:47 PM 5/3/05 -0400, Phillip J. Eby wrote:
>      http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/141934

Oops; that one's not really a valid example; the except StopIteration just 
has a harmless "pass", and it's not in a loop.


From python at rcn.com  Wed May  4 03:00:49 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 3 May 2005 21:00:49 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <5.1.1.6.0.20050503202653.02fa28f0@mail.telecommunity.com>
Message-ID: <001901c55044$b6cb4460$41b6958d@oemcomputer>

> >(So do you want this feature now or not? Earlier you said it was no
> >big deal.)
> 
> It *isn't* a big deal; but it'd still be nice, and I'd happily
> volunteer to do the actual implementation of the 'close()' method
> myself, because it's about the same amount of work as updating PEP 333
> and sorting out any political issues that might arise therefrom.  :)

Can I recommend tabling this one for the time being?  My sense is that
it can be accepted independently of PEP 340 but that it should wait
until afterwards because the obvious right-thing-to-do will be
influenced by what happens with 340.

Everyone's bandwidth is being maxed-out at this stage.  So it is
somewhat helpful to keep focused on the core proposal of generator
driven block/suite thingies.


Raymond

From python at jwp.name  Wed May  4 03:54:46 2005
From: python at jwp.name (James William Pye)
Date: Tue, 03 May 2005 18:54:46 -0700
Subject: [Python-Dev] Need to hook Py_FatalError
In-Reply-To: <20050503132639.6492.JCARLSON@uci.edu>
References: <20050503112637.648F.JCARLSON@uci.edu>
	<Xns964BE20F692E9token@80.91.229.5>
	<20050503132639.6492.JCARLSON@uci.edu>
Message-ID: <1115171686.62180.48.camel@localhost>

On Tue, 2005-05-03 at 13:39 -0700, Josiah Carlson wrote:

> If I'm wrong, I'd like to hear it, but I'm still waiting for your patch
> on sourceforge.

Well, if he lost/loses interest for whatever reason, I'd be willing to
provide.

Although, if m.u.k. is going to write it, please be sure to include a
CPP macro/define, so that embedders could recognize the feature without
having to run explicit checks or do version based fingerprinting. (I'd
be interested to follow the patch if you(muk) put it up!)

Hrm, although, I don't think it would be wise to allow extension modules
to set this. IMO, there should be some attempt to protect it; i.e., once
it's initialized, don't allow reinitialization: if the embedder is
handling it, it should keep handling it for the duration of the process.
So: a static function pointer in pythonrun.c initialized to NULL, a
protective setter that will only allow setting while the pointer is
still NULL, and Py_FatalError calling the hook when the pointer is
non-NULL.

Should Py_FatalError fall through if the hook didn't terminate the
process, to provide some guarantee that the process will indeed die?
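The scheme above can be sketched in plain C (all names hypothetical; a
real patch would live in pythonrun.c and hook the actual Py_FatalError):

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical hook slot: initialized to NULL, set at most once. */
typedef void (*fatal_hook_t)(const char *msg);
static fatal_hook_t fatal_hook = NULL;

/* Protective setter: only the first registration succeeds. */
int set_fatal_error_hook(fatal_hook_t hook)
{
    if (fatal_hook != NULL)
        return -1;  /* already claimed, presumably by the embedder */
    fatal_hook = hook;
    return 0;
}

/* Stand-in for Py_FatalError: call the hook if one is set, then fall
   through to abort() so the process is guaranteed to die even if the
   hook returns instead of terminating. */
void fatal_error(const char *msg)
{
    if (fatal_hook != NULL)
        fatal_hook(msg);
    fprintf(stderr, "Fatal error: %s\n", msg);
    abort();
}
```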

Sound good?
-- 
Regards, James William Pye

From tdelaney at avaya.com  Wed May  4 04:30:55 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Wed, 4 May 2005 12:30:55 +1000
Subject: [Python-Dev] PEP 340 -- concept clarification
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE025204DC@au3010avexu1.global.avaya.com>

Delaney, Timothy C (Timothy) wrote:

> Guido van Rossum wrote:
> 
>> I'd like the block statement to be defined exclusively in terms of
>> __exit__() though.
> 
> 1. If an iterator declares __exit__, it cannot be used in a for-loop.
>    For-loops do not guarantee resource cleanup.
> 
> 2. If an iterator does not declare __exit__, it cannot be used in a
> block-statement.
>    Block-statements guarantee resource cleanup.

Now some thoughts have solidified in my mind ... I'd like to define some
terminology that may be useful.

resource protocol:
    __next__
    __exit__

    Note: __iter__ is explicitly *not* required.

resource:
    An object that conforms to the resource protocol.

resource generator:
    A generator function that produces a resource.

resource usage statement/suite:
    A suite that uses a resource.

With this conceptual framework, I think the following makes sense:

- Keyword 'resource' for defining a resource generator.
- Keyword 'use' for using a resource.

e.g.

::

    resource locker (lock):
        lock.acquire()
        try:
            yield
        finally:
            lock.release()

    use locker(lock):
        # do stuff

Tim Delaney

From nbastin at opnet.com  Wed May  4 05:15:38 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Tue, 3 May 2005 23:15:38 -0400
Subject: [Python-Dev] Py_UNICODE madness
In-Reply-To: <ca471dc20505031544373d8c14@mail.gmail.com>
References: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>
	<ca471dc20505031544373d8c14@mail.gmail.com>
Message-ID: <8c00ac9db9a6be3e4a937eb6290c9838@opnet.com>


On May 3, 2005, at 6:44 PM, Guido van Rossum wrote:

> I think that documentation is wrong; AFAIK Py_UNICODE has always been
> allowed to be either 16 or 32 bits, and the source code goes through
> great lengths to make sure that you get a link error if you try to
> combine extensions built with different assumptions about its size.

That makes PyUnicode_FromUnicode() a lot less useful.  Well, really, 
not useful at all.

You might suggest that PyUnicode_FromWideChar is more useful, but 
that's only true on platforms that support wchar_t.

Is there no universally supported way of moving buffers of unicode data 
(as common data types, like unsigned short, etc.) into Python from C?

--
Nick


From gvanrossum at gmail.com  Wed May  4 05:42:11 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 3 May 2005 20:42:11 -0700
Subject: [Python-Dev] Py_UNICODE madness
In-Reply-To: <8c00ac9db9a6be3e4a937eb6290c9838@opnet.com>
References: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>
	<ca471dc20505031544373d8c14@mail.gmail.com>
	<8c00ac9db9a6be3e4a937eb6290c9838@opnet.com>
Message-ID: <ca471dc20505032042693dcd18@mail.gmail.com>

I really don't know. Effbot, MvL and/or MAL should know.

On 5/3/05, Nicholas Bastin <nbastin at opnet.com> wrote:
> 
> On May 3, 2005, at 6:44 PM, Guido van Rossum wrote:
> 
> > I think that documentation is wrong; AFAIK Py_UNICODE has always been
> > allowed to be either 16 or 32 bits, and the source code goes through
> > great lengths to make sure that you get a link error if you try to
> > combine extensions built with different assumptions about its size.
> 
> That makes PyUnicode_FromUnicode() a lot less useful.  Well, really,
> not useful at all.
> 
> You might suggest that PyUnicode_FromWideChar is more useful, but
> that's only true on platforms that support wchar_t.
> 
> Is there no universally supported way of moving buffers of unicode data
> (as common data types, like unsigned short, etc.) into Python from C?
> 
> --
> Nick
> 
> 


-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From greg.ewing at canterbury.ac.nz  Wed May  4 05:50:19 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 04 May 2005 15:50:19 +1200
Subject: [Python-Dev] 2 words keyword for block
In-Reply-To: <loom.20050503T214709-362@post.gmane.org>
References: <loom.20050503T214709-362@post.gmane.org>
Message-ID: <4278467B.1090404@canterbury.ac.nz>

Gheorghe Milas wrote:

> in template thread_safe(lock):
> in template redirected_stdout(stream):
> in template use_and_close_file(path) as file:
> in template as_transaction():
> in template auto_retry(times=3, failas=IOError):

-1. This is unpythonically verbose.

If I wanted to get lots of finger exercise typing redundant
keywords, I'd program in COBOL. :-)

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From reinhold-birkenfeld-nospam at wolke7.net  Wed May  4 10:09:11 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 04 May 2005 10:09:11 +0200
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
Message-ID: <d59vll$4qf$1@sea.gmane.org>

Hello,

after proposing this here (albeit deep in the PEP 340 thread) and
getting a somewhat affirmatory response from Guido, I have written
something that could become a PEP if sufficiently hatched...

---------------------------

PEP: XXX
Title: Unifying try-except and try-finally
Version: $Revision: $
Last-Modified: $Date: $
Author: Reinhold Birkenfeld <reinhold-birkenfeld-nospam at wolke7.net>
Status: Draft
Type: Standards Track
Content-Type: text/plain
Created: 04-May-2005
Post-History:


Abstract

    This PEP proposes a change in the syntax and semantics of try
    statements to allow combined try-except-finally blocks. This
    means in short that it would be valid to write

        try:
            <do something>
        except Exception:
            <handle the error>
        finally:
            <cleanup>


Rationale/Proposal

    There are many use cases for the try-except statement and
    for the try-finally statement per se; however, often one needs
    to catch exceptions and execute some cleanup code afterwards.
    It is slightly annoying and not very intelligible that
    one has to write

        f = None
        try:
            try:
                f = open(filename)
                text = f.read()
            except IOError:
                print 'An error occurred'
        finally:
            if f:
                f.close()

    So it is proposed that a construction like this

        try:
            <suite 1>
        except Ex1:
            <suite 2>
        <more except: clauses>
        else:
            <suite 3>
        finally:
            <suite 4>

    be exactly the same as the legacy

        try:
            try:
                <suite 1>
            except Ex1:
                <suite 2>
            <more except: clauses>
            else:
                <suite 3>
        finally:
            <suite 4>

    This is backwards compatible, and every try statement that is
    legal today would continue to work.
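    Applied to the file-handling example from the Rationale, the
    combined form reads as follows (a sketch; the message is returned
    instead of printed so the control flow is easy to follow):

```python
def read_file(filename):
    # Proposed combined try-except-finally, equivalent to the
    # nested try-try-except-finally form shown in the Rationale.
    f = None
    try:
        f = open(filename)
        return f.read()
    except IOError:
        return 'An error occurred'
    finally:
        if f:
            f.close()
```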


Changes to the grammar

    The grammar for the try statement, which is currently

        try_stmt: ('try' ':' suite (except_clause ':' suite)+
                   ['else' ':' suite] | 'try' ':' suite 'finally' ':' suite)

    would have to become

        try_stmt: ('try' ':' suite (except_clause ':' suite)+
                   ['else' ':' suite] ['finally' ':' suite] |
                   'try' ':' suite (except_clause ':' suite)*
                   ['else' ':' suite] 'finally' ':' suite)


Implementation

    As the PEP author currently does not have sufficient knowledge
    of the CPython implementation, he is unfortunately not able
    to deliver one.


References

    None yet.


Copyright

    This document has been placed in the public domain.


-----------------------
Reinhold


-- 
Mail address is perfectly valid!


From mal at egenix.com  Wed May  4 10:39:16 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Wed, 04 May 2005 10:39:16 +0200
Subject: [Python-Dev] Py_UNICODE madness
In-Reply-To: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>
References: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>
Message-ID: <42788A34.30301@egenix.com>

Nicholas Bastin wrote:
> The documentation for Py_UNICODE states the following:
> 
> "This type represents a 16-bit unsigned storage type which is used by  
> Python internally as basis for holding Unicode ordinals. On platforms 
> where wchar_t is available and also has 16-bits,  Py_UNICODE is a 
> typedef alias for wchar_t to enhance  native platform compatibility. On 
> all other platforms,  Py_UNICODE is a typedef alias for unsigned 
> short."
> 
> However, we have found this not to be true on at least certain RedHat 
> versions (maybe all, but I'm not willing to say that at this point).  
> pyconfig.h on these systems reports that PY_UNICODE_TYPE is wchar_t, 
> and PY_UNICODE_SIZE is 4.  Needless to say, this isn't consistent with 
> the docs.  It also creates quite a few problems when attempting to 
> interface Python with other libraries which produce unicode data.
> 
> Is this a bug, or is this behaviour intended?

It's a documentation bug. The above was true in Python 2.0 and
still is for standard Python builds. The optional 32-bit support
was added later on (in Python 2.1 IIRC) and is only used if Python
is compiled with --enable-unicode=ucs4.
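From Python code, the build variant can be checked via sys.maxunicode
(a sketch; narrow/UCS2 builds report 0xFFFF, wide/UCS4 builds 0x10FFFF):

```python
import sys

# Narrow (UCS2) builds cap code points at 0xFFFF; wide (UCS4) builds,
# e.g. those configured with --enable-unicode=ucs4, report 0x10FFFF.
if sys.maxunicode == 0xFFFF:
    print('narrow build: Py_UNICODE is 16 bits')
else:
    print('wide build: Py_UNICODE is 32 bits')
```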

Unfortunately, RedHat and others have made the UCS4 build their
default which caused and is still causing lots of problems
with Python extensions shipped as binaries, e.g. RPMs or
other packages.

> It turns out that at some point in the past, this created problems for 
> tkinter as well, so someone just changed the internal unicode 
> representation in tkinter to be 4 bytes as well, rather than tracking 
> down the real source of the problem.
> 
> Is PY_UNICODE_TYPE always going to be guaranteed to be 16 bits, or is 
> it dependent on your platform? (in which case we can give up now on 
> Python unicode compatibility with any other libraries).  

Depends on the way Python was compiled.

> At the very 
> least, if we can't guarantee the internal representation, then the 
> PyUnicode_FromUnicode API needs to go away, and be replaced with 
> something capable of transcoding various unicode inputs into the 
> internal python representation.

We have PyUnicode_Decode() for that. PyUnicode_FromUnicode is
useful and meant for working directly on Py_UNICODE buffers.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 04 2005)
 >>> Python/Zope Consulting and Support ...        http://www.egenix.com/
 >>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
 >>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From p.f.moore at gmail.com  Wed May  4 10:57:45 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 4 May 2005 09:57:45 +0100
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <d58tkb$fvp$1@sea.gmane.org>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
Message-ID: <79990c6b050504015762d004ac@mail.gmail.com>

On 5/3/05, Nicolas Fleury <nidoizo at yahoo.com> wrote:
> We could avoid explaining to a newbie why the following code doesn't
> work if "opening" could be implemented in a way that it works.
> 
> for filename in filenames:
>     block opening(filename) as file:
>         if someReason: break

My initial feeling was that this is a fairly major issue, not just
from an education POV, but also because it would be easy to imagine
someone converting

    f = open(name)
    ...
    close(f)

into

    opening(name) as f:
        ...

as part of a maintenance fix (an exception fails to close the file) or
tidy-up (upgrading code to conform to new best practices). But when I
tried to construct a plausible example, I couldn't find a case which
made real-life sense. For example, with Nicolas' original example:

    for name in filenames:
        opening(name) as f:
            if condition: break

I can't think of a reasonable condition which wouldn't involve reading
the file - which either involves an inner loop (and we already can't
break out of two loops, so the third one implied by the opening block
makes things no worse), or needs the whole file to be read (which can
be done via f = open(); data = f.read(); f.close(), so the opening
block doesn't actually help...)

So I believe the issue is less serious than I supposed at first - it's
certainly a teaching issue, but might not come up often enough in real
life to matter.

Oh, and by the way - I prefer the keywordless form of the block
statement (as used in my examples above). But it may exacerbate the
issue with break unless we have a really strong name for these
constructs ("break exits the innermost enclosing for, while, or um,
one of those things which nearly used the block keyword...") Actually,
maybe referring to them as "block statements", but using no keyword,
is perfectly acceptable. As I write, I'm finding it more and more
natural.

Paul.

From m.u.k.2 at gawab.com  Wed May  4 11:35:42 2005
From: m.u.k.2 at gawab.com (M.Utku K.)
Date: Wed, 4 May 2005 09:35:42 +0000 (UTC)
Subject: [Python-Dev] Need to hook Py_FatalError
References: <20050503112637.648F.JCARLSON@uci.edu>
	<Xns964BE20F692E9token@80.91.229.5>
	<20050503132639.6492.JCARLSON@uci.edu>
Message-ID: <Xns964C82A28CC9Btoken@80.91.229.5>

Hi,

Josiah Carlson <jcarlson at uci.edu> wrote in
news:20050503132639.6492.JCARLSON at uci.edu: 

>>> strip....
>> IMHO this should be left to hooker(apparerently not right word, but you
>> get the point :) ). If he allocates more mem. or does heavy stuff, that
>> will just fail. Anyway abort() is a failure too. Either abort() will
>> end the process or OS will on such a critical error.
> 
> I'm not talking about doing memory-intensive callbacks, I'm talking
> about the function call itself.
> 
>>From what I understand, any function call in Python requires a memory
> allocation. This is trivially true in the case of rentrant Python calls;
> which requires the allocation of a frame object from heap memory, and in
> the case of all calls, from C stack memory. If you cannot allocate a
> frame for __del__ method calling (one of the error conditions), you
> certainly aren't going to be able to call a Python callback (no heap
> memory), and may not have enough stack memory required by your logging
> function; even if it is written in C (especially if you construct a
> nontrivial portion of the message in memory before it is printed).
> 
> If I'm wrong, I'd like to hear it, but I'm still waiting for your patch
> on sourceforge.
>  - Josiah


Wait a minute, I guess I wasn't clear on that: the callback will exist only at 
the C level, something like "PySetFatalError_CallBack"; there will be no way to 
hook it from Python because, as you said, Python may have crashed hard (e.g. 
"Can't initialize type").

Best regards.


From m.u.k.2 at gawab.com  Wed May  4 11:46:14 2005
From: m.u.k.2 at gawab.com (M.Utku K.)
Date: Wed, 4 May 2005 09:46:14 +0000 (UTC)
Subject: [Python-Dev] Need to hook Py_FatalError
References: <20050503112637.648F.JCARLSON@uci.edu>
	<Xns964BE20F692E9token@80.91.229.5>
	<20050503132639.6492.JCARLSON@uci.edu>
	<1115171686.62180.48.camel@localhost>
Message-ID: <Xns964C846BCF835token@80.91.229.5>

Hi,

James William Pye <python at jwp.name> wrote in 
news:1115171686.62180.48.camel at localhost:

> On Tue, 2005-05-03 at 13:39 -0700, Josiah Carlson wrote:
> 
>> If I'm wrong, I'd like to hear it, but I'm still waiting for your patch
>> on sourceforge.
> 
> Well, if he lost/loses interest for whatever reason, I'd be willing to
> provide.
> 
> Although, if m.u.k. is going to write it, please be sure to include a
> CPP macro/define, so that embedders could recognize the feature without
> having to run explicit checks or do version based fingerprinting. (I'd
> be interested to follow the patch if you(muk) put it up!)
> 
> Hrm, although, I don't think it would be wise to allow extension modules
> to set this. IMO, there should be some attempt to protect it; ie, once
> it's initialized, don't allow reinitialization, as if the embedder is
> handling it, it should be handled through the duration of the process.
> So, a static function pointer in pythonrun.c initialized to NULL, a
> protective setter that will only allow setting if the pointer is NULL,
> and Py_FatalError calling the pointer if pointer != Py_FatalError.
> 
> Should [Py_FatalError] fall through if the hook didn't terminate the
> process to provide some level of warranty that the process will indeed
> die?
> 
> Sound good?


I haven't lost interest; I expect to publish the patch on SourceForge within a 
couple of days.

The reinit issue: the old way of returning the old callback when a new one is 
set sounds OK. Or, a better way: there could be an array holding all the 
callbacks, with Py_FatalError iterating over it and calling each one.

M. Utku Karatas

Best regards.


From ncoghlan at gmail.com  Wed May  4 12:25:32 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 04 May 2005 20:25:32 +1000
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <8dffe7626eaf9a89812c2828bcf96efe@fuhm.net>
References: <2mhdhkz8aw.fsf@starship.python.net>	<000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>	<ca471dc2050503095336305c8b@mail.gmail.com>
	<8dffe7626eaf9a89812c2828bcf96efe@fuhm.net>
Message-ID: <4278A31C.5000001@gmail.com>

James Y Knight wrote:
> On May 3, 2005, at 12:53 PM, Guido van Rossum wrote:
> 
>>def saving_stdout(f):
>>    save_stdout = sys.stdout
>>    try:
>>        sys.stdout = f
>>        yield
>>    finally:
>>        sys.stdout = save_stdout
> 
> 
> I hope you aren't going to be using that in any threaded program. 

sys.stdout is a global - threading issues are inherent in monkeying with it. At 
least this approach allows all code that redirects stdout to be easily serialised:

def redirect_stdout(f, the_lock=Lock()):
     locking(the_lock):
         save_stdout = sys.stdout
         try:
             sys.stdout = f
             yield
         finally:
             sys.stdout = save_stdout
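The save/set/restore core of that generator is easy to check in isolation, without the proposed block syntax. A minimal runnable sketch (hypothetical helper name with_stdout; a StringIO buffer stands in for the file f, and modern print-function syntax is used rather than the 2.4-era print statement):

```python
import sys
from io import StringIO

def with_stdout(f, func):
    # The body of the block is modeled as a callable; the finally
    # clause restores the original stream no matter how it exits.
    save_stdout = sys.stdout
    try:
        sys.stdout = f
        func()
    finally:
        sys.stdout = save_stdout

buf = StringIO()
with_stdout(buf, lambda: print('redirected'))
assert buf.getvalue() == 'redirected\n'
assert sys.stdout is not buf   # original stream restored
```

The threading caveat from the message still applies: sys.stdout is process-global, so concurrent redirectors must serialise around it.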

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From ncoghlan at gmail.com  Wed May  4 12:42:19 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 04 May 2005 20:42:19 +1000
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <1f7befae05050313212db5d4df@mail.gmail.com>
References: <ca471dc2050503095336305c8b@mail.gmail.com>	<001101c5500d$a7be7140$c704a044@oemcomputer>	<ca471dc205050311481385b64f@mail.gmail.com>	<1f7befae050503121346833d97@mail.gmail.com>	<ca471dc205050312485bde01fe@mail.gmail.com>
	<1f7befae05050313212db5d4df@mail.gmail.com>
Message-ID: <4278A70B.8040804@gmail.com>

Tim Peters wrote:
> I don't think anyone has mentioned this yet, so I will:  library
> writers using Decimal (or more generally HW 754 gimmicks) have a need
> to fiddle lots of thread-local state ("numeric context"), and must
> restore it no matter how the routine exits.  Like "boost precision to
> twice the user's value over the next 12 computations, then restore",
> and "no matter what happens here, restore the incoming value of the
> overflow-happened flag".  It's just another instance of temporarily
> taking over a shared resource, but I think it's worth mentioning that
> there are a lot of things "like that" in the world, and to which
> decorators don't really sanely apply.

To turn this example into PEP 340 based code:

     # A template to be provided by the decimal module
     # Context is thread-local, so there is no threading problem
     def in_context(context):
         old_context = getcontext()
         try:
             setcontext(context)
             yield
         finally:
             setcontext(old_context)

Used as follows:

     block decimal.in_context(Context(prec=12)):
         # Perform higher precision operations here
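The save/switch/restore dance that in_context performs around its yield can be exercised today as a plain function against the real decimal module (hypothetical name in_context_demo; decimal contexts are thread-local, as the template's comment notes):

```python
from decimal import Context, getcontext, setcontext, Decimal

def in_context_demo():
    # Save the current (thread-local) context, switch to a
    # higher-precision one, compute, then restore - the same
    # sequence the in_context template wraps around its yield.
    old_context = getcontext()
    try:
        setcontext(Context(prec=12))
        return Decimal(1) / Decimal(7)
    finally:
        setcontext(old_context)

default_prec = getcontext().prec
result = in_context_demo()
assert str(result) == '0.142857142857'    # computed at prec=12
assert getcontext().prec == default_prec  # original context restored
```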

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From fredrik at pythonware.com  Wed May  4 13:08:56 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Wed, 4 May 2005 13:08:56 +0200
Subject: [Python-Dev] anonymous blocks
References: <ca471dc205042116402d7d38da@mail.gmail.com><ca471dc205042416572da9db71@mail.gmail.com><426DB7C8.5020708@canterbury.ac.nz><ca471dc2050426043713116248@mail.gmail.com><426E3B01.1010007@canterbury.ac.nz><ca471dc205042621472b1f6edf@mail.gmail.com><ca471dc20504270030405f922f@mail.gmail.com><5.1.1.6.0.20050427105524.02479e70@mail.telecommunity.com><ca471dc205042713277846852d@mail.gmail.com><5.1.1.6.0.20050427164323.0332c2b0@mail.telecommunity.com>
	<ca471dc2050427145022e8985f@mail.gmail.com>
Message-ID: <d5aa42$8it$1@sea.gmane.org>

Guido van Rossum wrote:

> Fredrik, what does your intuition tell you?

having been busy with other stuff for nearly a week, and seeing that the PEP is now at
version 1.22, my intuition tells me that it's time to read the PEP again before I have any
opinion on anything ;-)

</F> 




From ncoghlan at gmail.com  Wed May  4 13:33:25 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 04 May 2005 21:33:25 +1000
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <5.1.1.6.0.20050503130527.02471ad0@mail.telecommunity.com>
References: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>	<2mhdhkz8aw.fsf@starship.python.net>	<000b01c54ff6$a0fb4ca0$c704a044@oemcomputer>
	<5.1.1.6.0.20050503130527.02471ad0@mail.telecommunity.com>
Message-ID: <4278B305.6040704@gmail.com>

Phillip J. Eby wrote:
> This and other examples from the PEP still have a certain awkwardness of 
> phrasing in their names.  A lot of them seem to cry out for a "with" 
> prefix, although maybe that's part of the heritage of PEP 310.  But Lisp 
> has functions like 'with-open-file', so I don't think that it's *all* a PEP 
> 310 influence on the examples.

I've written up a few examples in the course of the discussion, and the more of 
them I have written, the more the keywordless syntax has grown on me.

No meaningful name like 'with' or 'in' is appropriate for all possible block 
iterators, which leaves only keyword-for-the-sake-of-a-keyword options like 
'block' or 'suite'. With block statements viewed as user-defined blocks, leaving 
the keyword out lets the block iterator be named whatever is appropriate to 
making the block statement read well. If a leading 'with' is needed, just 
include it in the name.

That is, instead of a 'block statement with the locking block iterator', you 
write a 'locking statement'. Instead of a 'block statement with the opening 
block iterator', you write an 'opening statement'.

The benefit didn't stand out for me until writing examples with real code around 
the start of the block statement. Unlike existing statements, the keyword is 
essentially irrelevant in understanding the implications of the statement - the 
important thing is the block iterator being used. That is hard to see when the 
keyword is the only thing dedented from the contained suite.

Consider some of the use cases from the PEP, but put inside function definitions 
to make it harder to pick out the name of the block iterator:

   def my_func():
       block locking(the_lock):
           do_some_operation_while_holding_the_lock()

Versus:

   def my_func():
       locking(the_lock):
           do_some_operation_while_holding_the_lock()

And:

   def my_func(filename):
       block opening(filename) as f:
           for line in f:
               print line

Versus:

   def my_func(filename):
       opening(filename) as f:
           for line in f:
               print line


And a few more without the contrast:

   def my_func():
       do_transaction():
           db.delete_everything()


   def my_func():
       auto_retry(3, IOError):
           f = urllib.urlopen("http://python.org/peps/pep-0340.html")
           print f.read()

   def my_func():
       opening(filename, "w") as f:
           with_stdout(f):
               print "Hello world"


When Guido last suggested this, the main concern seemed to be that the 
documentation for every block iterator would need to explain the semantics of 
block statements, since the block iterator name is the only name to be looked up 
in the documentation. But they don't need to explain the general semantics, they 
only need to explain _their_ semantics, and possibly provide a pointer to the 
general block statement documentation. That is, explain _what_ the construct 
does (which is generally straightforward), not _how_ it does it (which is 
potentially confusing).

E.g.

   def locking(the_lock):
       """Executes the following nested block while holding the supplied lock

          Ensures the lock is acquired before entering the block and
          released when the block is exited (including via exceptions
          or return statements).
          If None is supplied as the argument, no locking occurs.
       """
       if the_lock is None:
           yield
       else:
           the_lock.acquire()
           try:
               yield
           finally:
               the_lock.release()

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From ncoghlan at gmail.com  Wed May  4 13:58:29 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 04 May 2005 21:58:29 +1000
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <5.1.1.6.0.20050503202653.02fa28f0@mail.telecommunity.com>
References: <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com>	<ca471dc205050311481385b64f@mail.gmail.com>	<001901c55027$50c101e0$c704a044@oemcomputer>	<5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com>	<ca471dc20505031533ab2da74@mail.gmail.com>	<5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com>	<ca471dc2050503164127505290@mail.gmail.com>	<5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com>
	<5.1.1.6.0.20050503202653.02fa28f0@mail.telecommunity.com>
Message-ID: <4278B8E5.7060303@gmail.com>

> At 05:17 PM 5/3/05 -0700, Guido van Rossum wrote:
>>But I kind of doubt that it's an issue; you'd have to have a
>>try/except catching StopIteration around a yield statement that
>>resumes the generator before this becomes an issue, and that sounds
>>extremely improbable.

The specific offending construct is:

   yield itr.next()

Wrapping that in a try/except StopIteration can be quite convenient, and is 
probably too common to ignore - Phillip's examples reminded me that some of my 
_own_ code uses this trick.
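The trick being referred to looks something like this (a generic sketch with hypothetical names, spelled with the modern next(itr) rather than the 2.4-era itr.next()): the generator forwards items from an underlying iterator and uses the StopIteration from the inner next() call to end its own iteration cleanly.

```python
def interleave(itr, filler):
    """Yield items from itr with filler after each one, stopping
    cleanly when the underlying iterator is exhausted."""
    while True:
        try:
            yield next(itr)   # the 'yield itr.next()' construct
        except StopIteration:
            return            # inner exhaustion ends this generator too
        yield filler

result = list(interleave(iter([1, 2, 3]), 0))
assert result == [1, 0, 2, 0, 3, 0]
```

(Much later, PEP 479 turned a StopIteration escaping a generator into RuntimeError, for reasons closely related to the ambiguity being discussed here, which makes the explicit try/except the only correct spelling.)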

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From ncoghlan at gmail.com  Wed May  4 14:32:01 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 04 May 2005 22:32:01 +1000
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE721278@au3010avexu1.global.avaya.com>
References: <338366A6D2E2CA4C9DAEAE652E12A1DE721278@au3010avexu1.global.avaya.com>
Message-ID: <4278C0C1.4040201@gmail.com>

Delaney, Timothy C (Timothy) wrote:
> Guido van Rossum wrote:
> 
> 
>>I'd like the block statement to be defined exclusively in terms of
>>__exit__() though.
> 
> 
> This does actually suggest something to me (note - just a thought - no
> real idea if it's got any merit).
> 
> Are there any use cases proposed for the block-statement (excluding the
> for-loop) that do *not* involve resource cleanup (i.e. need an
> __exit__)?
> 
> This could be the distinguishing feature between for-loops and
> block-statements:
> 
> 1. If an iterator declares __exit__, it cannot be used in a for-loop.
>    For-loops do not guarantee resource cleanup.
> 
> 2. If an iterator does not declare __exit__, it cannot be used in a
> block-statement.
>    Block-statements guarantee resource cleanup.
> 
> This gives separation of API (and thus purpose) whilst maintaining the
> simplicity of the concept. Unfortunately, generators then become a pain
> :( We would need additional syntax to declare that a generator was a
> block generator.

Ah, someone else did post this idea first :)

To deal with the generator issue, one option would be to follow up on Phillip's 
idea of a decorator to convert a generator (or perhaps any standard iterator) 
into a block iterator.

I think this would also do wonders for emphasising the difference between for 
loops and block statements.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From ncoghlan at gmail.com  Wed May  4 16:10:31 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 05 May 2005 00:10:31 +1000
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <79990c6b050504015762d004ac@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org>
	<427797D5.8030207@cirad.fr>	<17015.39213.522060.873605@montanaro.dyndns.org>	<ca471dc20505031013287a2e92@mail.gmail.com>	<17015.48830.223391.390538@montanaro.dyndns.org>	<ca471dc2050503113127f938b0@mail.gmail.com>	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
Message-ID: <4278D7D7.2040805@gmail.com>

Paul Moore wrote:
> Oh, and by the way - I prefer the keywordless form of the block
> statement (as used in my examples above). But it may exacerbate the
> issue with break unless we have a really strong name for these
> constructs

"may" exacerbate? Something of an understatement, unfortunately. 'break' and 
'continue' are going to be useless in most block statements, and in most of the 
cases where that is true, one would expect them to break out of a surrounding loop.

Reconsidering the non-looping semantics I discussed way back at the start of 
this exercise, this is what using auto_retry would look like with a single-pass 
block statement:

   for attempt in auto_retry(3, IOError):
       attempt:
           # Do something!
           # Including break and continue to get on with the next attempt!

(Now that's what I call a user defined statement - we're iterating over a list 
of them!)

Anyway, auto_retry and the block statements it returns could be implemented as:

   def suppressing(exc=Exception, on_success=None):
       """Suppresses the specified exception in the following block"""
       try:
           yield
       except exc:
           pass
       else:
           if on_success is not None:
                on_success()

   def just_do_it():
       """Simply executes the following block"""
       yield

   def auto_retry(times, exc=Exception):
       """Generates the specified number of attempts"""
       class cb(object):
           succeeded = False
           def __init__(self):
               cb.succeeded = True

       for i in xrange(times-1):
           yield suppressing(exc, cb)
           if cb.succeeded:
               break
       else:
           yield just_do_it()


(Prettier than my last attempt at writing this, but still not very pretty. 
However, I'm willing to trade the need for that callback in the implementation 
of auto_retry to get non-surprising behaviour from break and continue, as the 
latter is visible to the majority of users, but the former is not)

Note that the code above works, even *if* the block statement is a looping 
construct, making a mess out of TOOWTDI.

Making it single pass also simplifies the semantics of the block statement 
(using VAR1 and EXPR1 from PEP 340):

     finalised = False
     block_itr = EXPR1
     try:
         try:
             VAR1 = block_itr.next()
         except StopIteration:
             # Can still choose not to run the block at all
             finalised = True
         except:
             # There was an exception. Handle it or just reraise it.
             finalised = True
             exc = sys.exc_info()
             ext = getattr(block_itr, "__exit__", None)
             if ext is not None:
                 ext(*exc)   # May re-raise *exc
             else:
                 raise *exc   # Well, the moral equivalent :-)
     finally:
         if not finalised:
             # The block finished cleanly, or exited via
             # break, return or continue. Clean up the iterator.
             ext = getattr(block_itr, "__exit__", None)
             if ext is not None:
                 try:
                     ext(StopIteration)
                 except StopIteration:
                     pass


With single-pass semantics, an iterator used in a block statement would have 
its .next() method called exactly once, and its __exit__ method called exactly 
once if the call to .next() does not raise StopIteration. And there's no need to 
mess with the meaning of break, return or continue - they behave as usual, 
affecting the surrounding scope rather than the block statement.
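The single-pass expansion above can be exercised as an ordinary function. A runnable sketch with hypothetical names (run_block, Opening, body); the block's suite, which the expansion leaves implicit, is modeled as a callable, and a toy object with next/__exit__ stands in for a real block iterator:

```python
import sys

class Opening(object):
    """Toy single-pass block iterator following the expansion above."""
    def __init__(self, name):
        self.name = name
        self.closed = False
    def next(self):
        return '<file %s>' % self.name      # the VAR1 value
    def __exit__(self, *exc):
        self.closed = True                  # cleanup runs exactly once

def run_block(block_itr, body):
    finalised = False
    try:
        try:
            var = block_itr.next()
        except StopIteration:
            finalised = True                # chose not to run the block
        except:
            finalised = True                # error in next(): hand it to __exit__
            ext = getattr(block_itr, '__exit__', None)
            if ext is not None:
                ext(*sys.exc_info())
            else:
                raise
        else:
            body(var)    # the suite; break/continue/return in a real
                         # block statement would propagate to the caller
    finally:
        if not finalised:
            ext = getattr(block_itr, '__exit__', None)
            if ext is not None:
                try:
                    ext(StopIteration)
                except StopIteration:
                    pass

opener = Opening('/etc/passwd')
seen = []
run_block(opener, seen.append)
assert seen == ['<file /etc/passwd>']
assert opener.closed      # __exit__ was called exactly once
```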

The only new thing needed is an __exit__ method on generators (and the block 
syntax itself, of course).

Looks like I've come full circle, and am back to arguing for semantics closer to 
those in PEP 310. But I have a better reason now :)

> Actually,
> maybe referring to them as "block statements", but using no keyword,
> is perfectly acceptable. As I write, I'm finding it more and more
> natural.

Same here. Especially if the semantics are tweaked so that it *is* a 
straightforward statement instead of a loop.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From nbastin at opnet.com  Wed May  4 16:14:31 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Wed, 4 May 2005 10:14:31 -0400
Subject: [Python-Dev] Py_UNICODE madness
In-Reply-To: <42788A34.30301@egenix.com>
References: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>
	<42788A34.30301@egenix.com>
Message-ID: <f8b31d0345f1ac32e5f683889588005d@opnet.com>


On May 4, 2005, at 4:39 AM, M.-A. Lemburg wrote:
>> At the very least, if we can't guarantee the internal representation, 
>> then the PyUnicode_FromUnicode API needs to go away, and be replaced 
>> with something capable of transcoding various unicode inputs into the 
>> internal python representation.
>
> We have PyUnicode_Decode() for that. PyUnicode_FromUnicode is
> useful and meant for working directly on Py_UNICODE buffers.

Is this API documented anywhere?  (It's not in the Unicode Object 
section of the API doc).  Also, this is quite inefficient if the source 
data is in UTF-16, because it appears that I'll have to transcode my 
data to utf-8 before I can pass it to this function, but I guess I'll 
have to live with that.

--
Nick


From pedronis at strakt.com  Wed May  4 16:20:04 2005
From: pedronis at strakt.com (Samuele Pedroni)
Date: Wed, 04 May 2005 16:20:04 +0200
Subject: [Python-Dev] Python Language track at Europython,
 still possibilities to submit talks
Message-ID: <4278DA14.8030301@strakt.com>

I'm the track chair of the Python Language track at Europython (27-29 
June, Göteborg, Sweden). The general deadline for talk submission has 
been extended until the 7th of May.

There are still open slots in the language track. So if someone with 
(core) language interests is or may be interested in participating, 
there's still the possibility to submit talks about idioms, patterns, 
recent additions to the language (for example new 2.4 features), or 
other language-related topics.

http://www.europython.org/sections/tracks_and_talks/propose_a_talk/#language
http://www.europython.org/sections/tracks_and_talks/propose_a_talk/
http://www.europython.org

Regards,

Samuele Pedroni, Python Language Track chair.


From python at jwp.name  Wed May  4 16:58:54 2005
From: python at jwp.name (James William Pye)
Date: Wed, 04 May 2005 07:58:54 -0700
Subject: [Python-Dev] Need to hook Py_FatalError
In-Reply-To: <Xns964C846BCF835token@80.91.229.5>
References: <20050503112637.648F.JCARLSON@uci.edu>
	<Xns964BE20F692E9token@80.91.229.5>
	<20050503132639.6492.JCARLSON@uci.edu>
	<1115171686.62180.48.camel@localhost>
	<Xns964C846BCF835token@80.91.229.5>
Message-ID: <1115218734.62180.119.camel@localhost>

On Wed, 2005-05-04 at 09:46 +0000, M.Utku K. wrote:
> The reinit. issue: The old way of returning old callback when a new 
> callback is set sounds OK. Or better way: there may be an array to hold all 
> the callbacks, Py_FatalError iterates and call each.

Why should reinitialization be allowed at all? Seems to me that this
feature should be exclusively reserved for an embedding application to
handle the fatal error in an application-specific way; e.g. ereport(FATAL, ())
in PostgreSQL, which quickly exits after some cleanup. Why should an
extension module be allowed to set this, or reset it?
-- 
Regards, James William Pye

From aleaxit at yahoo.com  Wed May  4 17:14:54 2005
From: aleaxit at yahoo.com (Alex Martelli)
Date: Wed, 4 May 2005 08:14:54 -0700
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <79990c6b050504015762d004ac@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
Message-ID: <f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>


On May 4, 2005, at 01:57, Paul Moore wrote:

> tried to construct a plausible example, I couldn't find a case which
> made real-life sense. For example, with Nicolas' original example:
>
>     for name in filenames:
>         opening(name) as f:
>             if condition: break
>
> I can't think of a reasonable condition which wouldn't involve reading
> the file - which either involves an inner loop (and we already can't
> break out of two loops, so the third one implied by the opening block
> makes things no worse), or needs the whole file reading (which can be

Looking for a file with a certain magic number in its first two bytes...?

for name in filenames:
     opening(name) as f:
         if f.read(2) == '\xfe\xb0': break

This does seem to make real-life sense to me...


Alex


From p.f.moore at gmail.com  Wed May  4 17:27:48 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 4 May 2005 16:27:48 +0100
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
Message-ID: <79990c6b0505040827941ff0@mail.gmail.com>

On 5/4/05, Alex Martelli <aleaxit at yahoo.com> wrote:
> 
> On May 4, 2005, at 01:57, Paul Moore wrote:
> >
> > I can't think of a reasonable condition which wouldn't involve reading
> > the file - which either involves an inner loop (and we already can't
> > break out of two loops, so the third one implied by the opening block
> > makes things no worse), or needs the whole file reading (which can be
> 
> Looking for a file with a certain magicnumber in its 1st two bytes...?
> 
> for name in filenames:
>     opening(name) as f:
>         if f.read(2) == 0xFEB0: break
> 
> This does seem to make real-life sense to me...

Yes, that'd do. I can't say I think it would be common, but it's a
valid case. And the workaround is the usual messy flag variable:

for name in filenames:
    found = False
    opening(name) as f:
        if f.read(2) == '\xfe\xb0': found = True
    if found: break

Yuk.

Paul.

From steven.bethard at gmail.com  Wed May  4 17:35:18 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Wed, 4 May 2005 09:35:18 -0600
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <4278D7D7.2040805@gmail.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<4278D7D7.2040805@gmail.com>
Message-ID: <d11dcfba050504083551bb0a1e@mail.gmail.com>

On 5/4/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> With single-pass semantics, an iterator used in a block statement would have
> it's .next() method called exactly once, and it's __exit__ method called exactly
> once if the call to .next() does not raise StopIteration. And there's no need to
> mess with the meaning of break, return or continue - they behave as usual,
> affecting the surrounding scope rather than the block statement.
> 
> The only new thing needed is an __exit__ method on generators (and the block
> syntax itself, of course).

Makes me wonder if we shouldn't just return to the __enter__() and
__exit__() names of PEP 310[1] where for a generator __enter__() is
just an alias for next().  We could even require Phillip J. Eby's
"blockgenerator" decorator to rename next() to __enter__(), and add
the appropriate __exit__() method.  Something like:

    class blockgen(object):
        def __init__(self, gen):
            self.gen = gen
        def __enter__(self):
            self.gen.next()
        def __exit__(self):
            pass  # cause finally blocks to be executed
    
    def blockgenerator(genfunc):
        def getblockgen(*args, **kwargs):
            return blockgen(genfunc(*args, **kwargs))
        return getblockgen

to be used like:

    @blockgenerator
    def locking(lock):
        lock.acquire()
        try:
            yield
        finally:
            lock.release()
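Pulling the sketch together, here is a runnable modern-Python approximation. gen.close() did not exist in 2005; it is used here as a stand-in for the __exit__ machinery under discussion (it raises GeneratorExit at the yield, which is exactly what makes the finally clause run):

```python
import threading

class blockgen(object):
    """Wrap a generator, exposing PEP 310-style __enter__/__exit__."""
    def __init__(self, gen):
        self.gen = gen
    def __enter__(self):
        next(self.gen)       # run up to the yield
    def __exit__(self):
        self.gen.close()     # cause finally blocks to be executed

def blockgenerator(genfunc):
    def getblockgen(*args, **kwargs):
        return blockgen(genfunc(*args, **kwargs))
    return getblockgen

@blockgenerator
def locking(lock):
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

lock = threading.Lock()
block = locking(lock)
block.__enter__()
assert lock.locked()      # lock held inside the "block"
block.__exit__()
assert not lock.locked()  # released by the finally clause
```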

'Course, it might be even nicer if try/finally around a yield could
only be used with block generators...  To get a syntax error, we'd
have to replace the decorator with a new syntax, e.g. Tim Delaney's
"resource" instead of "def" syntax or maybe using something like
"blockyield" or "resourceyield" instead of "yield" (though these are
probably too long)...

Steve

[1]http://www.python.org/peps/pep-0310.html
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From m.u.k.2 at gawab.com  Wed May  4 17:29:33 2005
From: m.u.k.2 at gawab.com (M.Utku K.)
Date: Wed, 4 May 2005 15:29:33 +0000 (UTC)
Subject: [Python-Dev] Need to hook Py_FatalError
References: <20050503112637.648F.JCARLSON@uci.edu>
	<Xns964BE20F692E9token@80.91.229.5>
	<20050503132639.6492.JCARLSON@uci.edu>
	<1115171686.62180.48.camel@localhost>
	<Xns964C846BCF835token@80.91.229.5>
	<1115218734.62180.119.camel@localhost>
Message-ID: <Xns964CBEA1EFA75token@80.91.229.5>

James William Pye <python at jwp.name> wrote in
news:1115218734.62180.119.camel at localhost: 

> On Wed, 2005-05-04 at 09:46 +0000, M.Utku K. wrote:
>> The reinit. issue: The old way of returning old callback when a new 
>> callback is set sounds OK. Or better way: there may be an array to hold
>> all the callbacks, Py_FatalError iterates and call each.
> 
> Why should reinitialization be allowed at all? Seems to me that this
> feature should be exclusively reserved for an embedding application to
> handle the fatal in an application specific way; ie ereport(FATAL,()) in
> PostgreSQL, which quickly exits after some cleanup. Why should an
> extension module be allowed to set this, or reset it?

What if more than one extension needs it?
Currently I'm doing:

callback_type SetCallBack(callback_type newfunc)

This will set the callback to newfunc and return the old one. The extension 
developer may discard the old callback or call it, at his own discretion. 
What do you think?


From mal at egenix.com  Wed May  4 19:39:55 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Wed, 04 May 2005 19:39:55 +0200
Subject: [Python-Dev] Py_UNICODE madness
In-Reply-To: <f8b31d0345f1ac32e5f683889588005d@opnet.com>
References: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>	<42788A34.30301@egenix.com>
	<f8b31d0345f1ac32e5f683889588005d@opnet.com>
Message-ID: <427908EB.2050904@egenix.com>

Nicholas Bastin wrote:
> 
> On May 4, 2005, at 4:39 AM, M.-A. Lemburg wrote:
> 
>>> At the very least, if we can't guarantee the internal representation, 
>>> then the PyUnicode_FromUnicode API needs to go away, and be replaced 
>>> with something capable of transcoding various unicode inputs into the 
>>> internal python representation.
>>
>>
>> We have PyUnicode_Decode() for that. PyUnicode_FromUnicode is
>> useful and meant for working directly on Py_UNICODE buffers.
> 
> 
> Is this API documented anywhere?  (It's not in the Unicode Object 
> section of the API doc).  Also, this is quite inefficient if the source 
> data is in UTF-16, because it appears that I'll have to transcode my 
> data to utf-8 before I can pass it to this function, but I guess I'll 
> have to live with that.

Not at all. You pass in the pointer, the function does the rest:

http://docs.python.org/api/builtinCodecs.html

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 04 2005)
 >>> Python/Zope Consulting and Support ...        http://www.egenix.com/
 >>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
 >>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From nbastin at opnet.com  Wed May  4 17:54:32 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Wed, 4 May 2005 11:54:32 -0400
Subject: [Python-Dev] Py_UNICODE madness
In-Reply-To: <427908EB.2050904@egenix.com>
References: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>	<42788A34.30301@egenix.com>
	<f8b31d0345f1ac32e5f683889588005d@opnet.com>
	<427908EB.2050904@egenix.com>
Message-ID: <27a821b5583891d3ea4dfa62f0eafeb8@opnet.com>


On May 4, 2005, at 1:39 PM, M.-A. Lemburg wrote:

> Nicholas Bastin wrote:
>> On May 4, 2005, at 4:39 AM, M.-A. Lemburg wrote:
>>>> At the very least, if we can't guarantee the internal 
>>>> representation, then the PyUnicode_FromUnicode API needs to go 
>>>> away, and be replaced with something capable of transcoding various 
>>>> unicode inputs into the internal python representation.
>>>
>>>
>>> We have PyUnicode_Decode() for that. PyUnicode_FromUnicode is
>>> useful and meant for working directly on Py_UNICODE buffers.
>> Is this API documented anywhere?  (It's not in the Unicode Object 
>> section of the API doc).  Also, this is quite inefficient if the 
>> source data is in UTF-16, because it appears that I'll have to 
>> transcode my data to utf-8 before I can pass it to this function, but 
>> I guess I'll have to live with that.
>
> Not at all. You pass in the pointer, the function does the rest:

Ah, I missed the codec registry lookup.  Thanks.

I'll change the Py_UNICODE doc, if anyone has a suggestion as to what 
to change it *to*...

--
Nick


From nbastin at opnet.com  Wed May  4 17:59:40 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Wed, 4 May 2005 11:59:40 -0400
Subject: [Python-Dev] New Py_UNICODE doc
Message-ID: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>

The current documentation for Py_UNICODE states:

"This type represents a 16-bit unsigned storage type which is used by  
Python internally as basis for holding Unicode ordinals. On  platforms 
where wchar_t is available and also has 16-bits,  Py_UNICODE is a 
typedef alias for wchar_t to enhance  native platform compatibility. On 
all other platforms,  Py_UNICODE is a typedef alias for unsigned 
short."

I propose changing this to:

"This type represents the storage type which is used by Python 
internally as the basis for holding Unicode ordinals.  On platforms 
where wchar_t is available, Py_UNICODE is a typedef alias for wchar_t 
to enhance native platform compatibility.  On all other platforms, 
Py_UNICODE is a typedef alias for unsigned short.  Extension module 
developers should make no assumptions about the size of this type on 
any given platform."

If no one has a problem with that, I'll make the change in CVS.

--
Nick


From theller at python.net  Wed May  4 18:08:54 2005
From: theller at python.net (Thomas Heller)
Date: Wed, 04 May 2005 18:08:54 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> (Nicholas Bastin's
	message of "Wed, 4 May 2005 11:59:40 -0400")
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>
Message-ID: <4qdjm0rt.fsf@python.net>

Nicholas Bastin <nbastin at opnet.com> writes:

> The current documentation for Py_UNICODE states:
>
> "This type represents a 16-bit unsigned storage type which is used by  
> Python internally as basis for holding Unicode ordinals. On  platforms 
> where wchar_t is available and also has 16-bits,  Py_UNICODE is a 
> typedef alias for wchar_t to enhance  native platform compatibility. On 
> all other platforms,  Py_UNICODE is a typedef alias for unsigned 
> short."
>
> I propose changing this to:
>
> "This type represents the storage type which is used by Python 
> internally as the basis for holding Unicode ordinals.  On platforms 
> where wchar_t is available, Py_UNICODE is a typedef alias for wchar_t 
> to enhance native platform compatibility.  On all other platforms, 
> Py_UNICODE is a typedef alias for unsigned short.  Extension module 
> developers should make no assumptions about the size of this type on 
> any given platform."
>
> If no one has a problem with that, I'll make the change in CVS.

AFAIK, you can configure Python to use 16-bit or 32-bit Unicode chars,
independent of the size of wchar_t.  The HAVE_USABLE_WCHAR_T macro can
be used by extension writers to determine if Py_UNICODE is the same as
wchar_t.  At least that's my understanding, so the above still seems
wrong.  And +1 for trying to clean up this confusion.

Thomas
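(For what it's worth, the width a particular build was configured with is also visible from Python itself via sys.maxunicode — 0xFFFF on narrow builds, 0x10FFFF on wide builds — which can be handy in test scripts; a small sketch:)

```python
import sys

# sys.maxunicode reflects the configured Py_UNICODE width:
# 0xFFFF on narrow (UCS-2) builds, 0x10FFFF on wide (UCS-4) builds.
if sys.maxunicode == 0xFFFF:
    build = "narrow (UCS-2)"
else:
    build = "wide (UCS-4)"
print(build)
```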


From jepler at unpythonic.net  Wed May  4 18:12:26 2005
From: jepler at unpythonic.net (Jeff Epler)
Date: Wed, 4 May 2005 11:12:26 -0500
Subject: [Python-Dev] Need to hook Py_FatalError
In-Reply-To: <Xns964CBEA1EFA75token@80.91.229.5>
References: <20050503112637.648F.JCARLSON@uci.edu>
	<Xns964BE20F692E9token@80.91.229.5>
	<20050503132639.6492.JCARLSON@uci.edu>
	<1115171686.62180.48.camel@localhost>
	<Xns964C846BCF835token@80.91.229.5>
	<1115218734.62180.119.camel@localhost>
	<Xns964CBEA1EFA75token@80.91.229.5>
Message-ID: <20050504161226.GD14737@unpythonic.net>

On Wed, May 04, 2005 at 03:29:33PM +0000, M.Utku K. wrote:
> James William Pye <python at jwp.name> wrote in
> news:1115218734.62180.119.camel at localhost: 
> > Why should reinitialization be allowed at all? Seems to me that this
> > feature should be exclusively reserved for an embedding application to
> > handle the fatal in an application specific way; ie ereport(FATAL,()) in
> > PostgreSQL, which quickly exits after some cleanup. Why should an
> > extension module be allowed to set this, or reset it?
> 
> What if more than one extension needs it ?

I agree with James; as I imagine this feature, it is for programs that
embed Python, not for extensions.  Whether the hook would be written to
prevent this from being done, or whether it would just be documented as
"for embedders only", I don't care.

In my own application, I didn't use a setter function, I just created a
new global variable.  This works fine for me.  It doesn't prevent the
(abusive, in my view) hooking of the error handler by any old extension,
but since my application doesn't currently import shared modules it
doesn't matter.

--- /tmp/Python-2.3/Python/pythonrun.c	2003-07-15 20:54:38.000000000 -0500
+++ ./pythonrun.c	2005-04-11 13:32:39.000000000 -0500
@@ -1435,9 +1435,14 @@
 
 /* Print fatal error message and abort */
 
+void (*Py_FatalErrorHandler)(const char *msg) = NULL;
 void
 Py_FatalError(const char *msg)
 {
+        if(Py_FatalErrorHandler != NULL) { 
+                Py_FatalErrorHandler(msg);
+                fprintf(stderr, "PyFatalErrorHandler returned\n");
+        }
 	fprintf(stderr, "Fatal Python error: %s\n", msg);
 #ifdef MS_WINDOWS
 	OutputDebugString("Fatal Python error: ");
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 189 bytes
Desc: not available
Url : http://mail.python.org/pipermail/python-dev/attachments/20050504/3c3b1daf/attachment.pgp

From reinhold-birkenfeld-nospam at wolke7.net  Wed May  4 18:23:03 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 04 May 2005 18:23:03 +0200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <79990c6b0505040827941ff0@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org>
	<427797D5.8030207@cirad.fr>	<17015.39213.522060.873605@montanaro.dyndns.org>	<ca471dc20505031013287a2e92@mail.gmail.com>	<17015.48830.223391.390538@montanaro.dyndns.org>	<ca471dc2050503113127f938b0@mail.gmail.com>	<d58tkb$fvp$1@sea.gmane.org>	<79990c6b050504015762d004ac@mail.gmail.com>	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<79990c6b0505040827941ff0@mail.gmail.com>
Message-ID: <d5asjh$djm$1@sea.gmane.org>

Paul Moore wrote:
> On 5/4/05, Alex Martelli <aleaxit at yahoo.com> wrote:
>> 
>> On May 4, 2005, at 01:57, Paul Moore wrote:
>> >
>> > I can't think of a reasonable condition which wouldn't involve reading
>> > the file - which either involves an inner loop (and we already can't
>> > break out of two loops, so the third one implied by the opening block
>> > makes things no worse), or needs the whole file reading (which can be
>> 
>> Looking for a file with a certain magicnumber in its 1st two bytes...?
>> 
>> for name in filenames:
>>     opening(name) as f:
>>         if f.read(2) == 0xFEB0: break
>> 
>> This does seem to make real-life sense to me...
> 
> Yes, that'd do. I can't say I think it would be common, but it's a
> valid case. And the workaround is the usual messy flag variable:
> 
> for name in filenames:
>     found = False
>     opening(name) as f:
>         if f.read(2) == 0xFEB0: found = True
>     if found: break

Is there anything we could do about this?
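(The flag can at least be avoided today by factoring the loop into a function, where return serves as a multi-level break — a sketch, with made-up file names and magic value, using plain try/finally in place of the proposed block:)

```python
# Sketch of the flag-free alternative: put the search in a helper so
# that "return" breaks out of both the loop and the file handling.
# File names and the magic value are made up for illustration.
import os
import tempfile

MAGIC = b"\xfe\xb0"

def find_magic(filenames):
    """Return the first file whose first two bytes equal MAGIC, else None."""
    for name in filenames:
        f = open(name, "rb")
        try:
            if f.read(2) == MAGIC:
                return name
        finally:
            f.close()
    return None

# Throwaway demo files:
tmp = tempfile.mkdtemp()
plain = os.path.join(tmp, "plain.bin")
magic = os.path.join(tmp, "magic.bin")
with open(plain, "wb") as f:
    f.write(b"no")
with open(magic, "wb") as f:
    f.write(MAGIC + b"rest")

found = find_magic([plain, magic])
```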

Reinhold

-- 
Mail address is perfectly valid!


From adamsz at gmail.com  Wed May  4 18:41:02 2005
From: adamsz at gmail.com (Adam Souzis)
Date: Wed, 4 May 2005 09:41:02 -0700
Subject: [Python-Dev] "begin" as keyword for pep 340
Message-ID: <d7cee4a7050504094122979f5a@mail.gmail.com>

I'm a bit surprised that no one has yet [1] suggested "begin" as a
keyword instead of "block", as it seems to express the intent of blocks
and is concise and readable.  For example, here are the examples in
PEP 340 rewritten using "begin":

begin locking():
   ...

begin opening(path) as f: #how about: begin using_file(path) as f:
   ...

begin transaction(db):
  ... 

begin auto_retry(3):
   ...

begin redirecting_stdout:
  ....

Probably the biggest problem with "begin" is that it is relatively
common as an identifier. For example, grepping through Python's Lib
directory shows "begin" used as a method name twice (in httplib and
idlelib.pyshell) and as a local variable twice (in mhlib and pyassemb).

However, I can't think of many instances where there would be
ambiguity in usage -- could "begin" be a pseudo-identifier like "as"
for some transitional time?

-- adam

[1] (Or maybe GMail's search has failed me ;)

From michele.simionato at gmail.com  Wed May  4 18:49:17 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Wed, 4 May 2005 12:49:17 -0400
Subject: [Python-Dev] my first post: asking about a "decorator" module
Message-ID: <4edc17eb050504094950154ed0@mail.gmail.com>

My first post to python-dev, but I guess my name is not completely unknown
in this list ;)

Actually, I have been wondering about subscribing to python-dev for at least 
a couple of years, but never did it, because of the limited amount of time 
I have to follow all the interesting mailing lists in the world.

However, in the last few months I have been involved with teaching Python 
and I have decided to follow more closely the development to keep myself
updated on what is going on.

Plus, I have some ideas I would like to share with people in this list.

One of them concerns decorators.

Are there plans to improve decorators support in future Python versions?
By "improving decorator support" I mean for instance a module in the standard
library providing some commonly used decorators such as ``memoize``,
or utilities to create and compose decorators, and things like that.

I have been doing some work on decorators lately and I would be
willing to help if there is general interest in a "decorator"
module. Actually, I already have a good candidate function for that
module, and plenty of recipes.

I submitted an early version of the idea some time ago on c.l.py

http://groups-beta.google.com/group/comp.lang.python/browse_frm/thread/60f22ed33af5dbcb/5f870d271456ccf3?q=simionato+decorate&rnum=1&hl=en#5f870d271456ccf3

but I could as well flesh it out and deliver a module people can
play with and see if they like it. This is especially interesting in this
moment, since decorators may address many of the use cases 
of PEP 340 (not all of them).

I need to write down some documentation, but it could be done by tomorrow.

What do people think?


           Michele Simionato
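(As a taste of what such a module might contain, here is a minimal memoize sketch — the name and interface are just one possible candidate, not anything already agreed on:)

```python
import functools

def memoize(func):
    """Cache func's results, keyed by positional arguments (hashable only)."""
    cache = {}
    @functools.wraps(func)      # preserve the wrapped function's name/docstring
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    return wrapper

calls = [0]

@memoize
def fib(n):
    calls[0] += 1
    return n if n < 2 else fib(n - 1) + fib(n - 2)

result = fib(20)   # each n from 0..20 is computed exactly once
```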

From reinhold-birkenfeld-nospam at wolke7.net  Wed May  4 18:57:58 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 04 May 2005 18:57:58 +0200
Subject: [Python-Dev] "begin" as keyword for pep 340
In-Reply-To: <d7cee4a7050504094122979f5a@mail.gmail.com>
References: <d7cee4a7050504094122979f5a@mail.gmail.com>
Message-ID: <d5aul0$l90$1@sea.gmane.org>

Adam Souzis wrote:
> I'm a bit surpised that no one has yet [1] suggested "begin" as a
> keyword instead "block" as it seems to express the intent of blocks
> and is concise and readable.  For example, here are the examples in
> PEP 340 rewritten using "begin":
> 
> begin locking():
>    ...

I don't know, but I always would expect "end" to follow each begin
somewhere...

the-good-old-pascal-days-ly yours,
Reinhold

PS: What about "using"? Too C#-ish?

-- 
Mail address is perfectly valid!


From mwh at python.net  Wed May  4 19:02:20 2005
From: mwh at python.net (Michael Hudson)
Date: Wed, 04 May 2005 18:02:20 +0100
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> (Nicholas Bastin's
	message of "Wed, 4 May 2005 11:59:40 -0400")
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>
Message-ID: <2mhdhiyler.fsf@starship.python.net>

Nicholas Bastin <nbastin at opnet.com> writes:

> The current documentation for Py_UNICODE states:
>
> "This type represents a 16-bit unsigned storage type which is used by  
> Python internally as basis for holding Unicode ordinals. On  platforms 
> where wchar_t is available and also has 16-bits,  Py_UNICODE is a 
> typedef alias for wchar_t to enhance  native platform compatibility. On 
> all other platforms,  Py_UNICODE is a typedef alias for unsigned 
> short."
>
> I propose changing this to:
>
> "This type represents the storage type which is used by Python 
> internally as the basis for holding Unicode ordinals.  On platforms 
> where wchar_t is available, Py_UNICODE is a typedef alias for wchar_t 
> to enhance native platform compatibility.

This just isn't true.  Have you read ./configure --help recently?

> On all other platforms, Py_UNICODE is a typedef alias for unsigned
> short.  Extension module developers should make no assumptions about
> the size of this type on any given platform."

I like this last sentence, though.

> If no one has a problem with that, I'll make the change in CVS.

I have a problem with replacing one lie with another :)

Cheers,
mwh

-- 
  Just put the user directories on a 486 with deadrat7.1 and turn the
  Octane into the afforementioned beer fridge and keep it in your
  office. The lusers won't notice the difference, except that you're
  more cheery during office hours.              -- Pim van Riezen, asr

From gvanrossum at gmail.com  Wed May  4 19:09:43 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 4 May 2005 10:09:43 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <d59vll$4qf$1@sea.gmane.org>
References: <d59vll$4qf$1@sea.gmane.org>
Message-ID: <ca471dc205050410097355aa80@mail.gmail.com>

Nice one. Should be a piece of cake to implement. Please talk to
peps at python.org about getting it checked into the PEP database.

I'm +1 on accepting this now -- anybody against?

On 5/4/05, Reinhold Birkenfeld <reinhold-birkenfeld-nospam at wolke7.net> wrote:
> Hello,
> 
> after proposing this here (albeit deep in the PEP 340 thread) and
> getting a somewhat affirmatory response from Guido, I have written
> something that could become a PEP if sufficiently hatched...
> 
> ---------------------------
> 
> PEP: XXX
> Title: Unifying try-except and try-finally
> Version: $Revision: $
> Last-Modified: $Date: $
> Author: Reinhold Birkenfeld <reinhold-birkenfeld-nospam at wolke7.net>
> Status: Draft
> Type: Standards Track
> Content-Type: text/plain
> Created: 04-May-2005
> Post-History:
> 
> Abstract
> 
>     This PEP proposes a change in the syntax and semantics of try
>     statements to allow combined try-except-finally blocks. This
>     means in short that it would be valid to write
> 
>         try:
>             <do something>
>         except Exception:
>             <handle the error>
>         finally:
>             <cleanup>
> 
> Rationale/Proposal
> 
>     There are many use cases for the try-except statement and
>     for the try-finally statement per se; however, often one needs
>     to catch exceptions and execute some cleanup code afterwards.
>     It is slightly annoying and not very intelligible that
>     one has to write
> 
>         f = None
>         try:
>             try:
>                 f = open(filename)
>                 text = f.read()
>             except IOError:
>                 print 'An error occured'
>         finally:
>             if f:
>                 f.close()
> 
>     So it is proposed that a construction like this
> 
>         try:
>             <suite 1>
>         except Ex1:
>             <suite 2>
>         <more except: clauses>
>         else:
>             <suite 3>
>         finally:
>             <suite 4>
> 
>     be exactly the same as the legacy
> 
>         try:
>             try:
>                 <suite 1>
>             except Ex1:
>                 <suite 2>
>             <more except: clauses>
>             else:
>                 <suite 3>
>         finally:
>             <suite 4>
> 
>     This is backwards compatible, and every try statement that is
>     legal today would continue to work.
> 
> Changes to the grammar
> 
>     The grammar for the try statement, which is currently
> 
>         try_stmt: ('try' ':' suite (except_clause ':' suite)+
>                    ['else' ':' suite] | 'try' ':' suite 'finally' ':' suite)
> 
>     would have to become
> 
>         try_stmt: ('try' ':' suite (except_clause ':' suite)+
>                    ['else' ':' suite] ['finally' ':' suite] |
>                    'try' ':' suite (except_clause ':' suite)*
>                    ['else' ':' suite] 'finally' ':' suite)
> 
> Implementation
> 
>     As the PEP author currently does not have sufficient knowledge
>     of the CPython implementation, he is unfortunately not able
>     to deliver one.
> 
> References
> 
>     None yet.
> 
> Copyright
> 
>     This document has been placed in the public domain.
> 
> -----------------------
> Reinhold
> 
> --
> Mail address is perfectly valid!
> 
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: http://mail.python.org/mailman/options/python-dev/guido%40python.org
> 


-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gjc at inescporto.pt  Wed May  4 19:14:01 2005
From: gjc at inescporto.pt (Gustavo J. A. M. Carneiro)
Date: Wed, 04 May 2005 18:14:01 +0100
Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword
Message-ID: <1115226841.7909.24.camel@localhost>

  I have not read every email about this subject, so sorry if this has
already been mentioned.

  In PEP 340 I read:

        block EXPR1 as VAR1:
            BLOCK1

  I think it would be much clearer this (plus you save one keyword):

        block VAR1 = EXPR1:
            BLOCK1

  Regards.

-- 
Gustavo J. A. M. Carneiro
<gjc at inescporto.pt> <gustavo at users.sourceforge.net>
The universe is always one step beyond logic.


From nbastin at opnet.com  Wed May  4 19:19:34 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Wed, 4 May 2005 13:19:34 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <2mhdhiyler.fsf@starship.python.net>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>
	<2mhdhiyler.fsf@starship.python.net>
Message-ID: <19b53d82c6eb11419ddf4cb529241f64@opnet.com>


On May 4, 2005, at 1:02 PM, Michael Hudson wrote:

> Nicholas Bastin <nbastin at opnet.com> writes:
>
>> The current documentation for Py_UNICODE states:
>>
>> "This type represents a 16-bit unsigned storage type which is used by
>> Python internally as basis for holding Unicode ordinals. On  platforms
>> where wchar_t is available and also has 16-bits,  Py_UNICODE is a
>> typedef alias for wchar_t to enhance  native platform compatibility. 
>> On
>> all other platforms,  Py_UNICODE is a typedef alias for unsigned
>> short."
>>
>> I propose changing this to:
>>
>> "This type represents the storage type which is used by Python
>> internally as the basis for holding Unicode ordinals.  On platforms
>> where wchar_t is available, Py_UNICODE is a typedef alias for wchar_t
>> to enhance native platform compatibility.
>
> This just isn't true.  Have you read ./configure --help recently?

Ok, so the above statement is true if the user does not set 
--enable-unicode=ucs[24] (I was reading the wchar_t test in configure.in, 
not the generated configure help).

Alternatively, we shouldn't talk about the size at all, and just leave 
the first and last sentences:

"This type represents the storage type which is used by Python 
internally as the basis for holding Unicode ordinals.  Extension module 
developers should make no assumptions about the size of this type on 
any given platform."

--
Nick


From rodsenra at gpr.com.br  Wed May  4 19:30:21 2005
From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra)
Date: Wed, 04 May 2005 17:30:21 -0000
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <ca471dc2050503132010abb4df@mail.gmail.com>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
	<ca471dc2050503132010abb4df@mail.gmail.com>
Message-ID: <20020107054513.566d74ed@localhost.localdomain>


 [ Guido ]:
 > 1. Decide on a keyword to use, if any.

 Shouldn't it be the other way around?
 Decide to use *no* keyword, if one can be avoided. 

 In my large inexperience *no keyword* is much better (if feasible):
  
 1) No name conflicts with previous code: block, blocktemplate, whatever
 2) ':' is already a block (broader sense) indication
 3) Improved readability:

    <<from PEP 340>>
    def locking_opening(lock, filename, mode="r"):
        block locking(lock):
            block opening(filename) as f:
                yield f

    <<from PEP 340>>
    def locking_opening(lock, filename, mode="r"):
        locking(lock):
            opening(filename) as f:
                yield f

 4) Better to make the language parser more complex than the language 
    exposed to end-users

 Following the PEP and this thread, it seems to me that __no keyword__
 is considered less preferable than __some keyword__ (== 'block' so far),
 and I wonder why it is not the reverse. Perhaps I missed something?

 Besides, I think this solves many issues AOP was trying to tackle, in
 a much cleaner, more elegant -- therefore pythonic -- way. Outstanding.

 best regards,
 Senra

-- 
Rodrigo Senra                 
--
MSc Computer Engineer    rodsenra(at)gpr.com.br  
GPr Sistemas Ltda        http://www.gpr.com.br/ 
Personal Blog     http://rodsenra.blogspot.com/


From barry at python.org  Wed May  4 19:36:15 2005
From: barry at python.org (Barry Warsaw)
Date: Wed, 04 May 2005 13:36:15 -0400
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc205050410097355aa80@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
Message-ID: <1115228174.21868.18.camel@geddy.wooz.org>

On Wed, 2005-05-04 at 13:09, Guido van Rossum wrote:
> Nice one. Should be a piece of cake to implement. Please talk to
> peps at python.org about getting it checked into the PEP database.

+1!
-Barry


-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 307 bytes
Desc: This is a digitally signed message part
Url : http://mail.python.org/pipermail/python-dev/attachments/20050504/120d6578/attachment.pgp

From rodsenra at gpr.com.br  Wed May  4 20:23:17 2005
From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra)
Date: Wed, 04 May 2005 18:23:17 -0000
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc205050410097355aa80@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
Message-ID: <20020107063813.0e603043@localhost.localdomain>

[ Guido ]:
> Nice one. Should be a piece of cake to implement. Please talk to
> peps at python.org about getting it checked into the PEP database.
> 
> I'm +1 on accepting this now -- anybody against?

+1

Last week, while I was giving a Python course (in Rio de Janeiro, Brazil),
some students attempted to use try/except/finally blocks. I had to
dig into the grammar to prove to them that it was __not already__ supported.

cheers,
Senra

-- 
Rodrigo Senra                 
--
MSc Computer Engineer    rodsenra(at)gpr.com.br  
GPr Sistemas Ltda        http://www.gpr.com.br/ 
Personal Blog     http://rodsenra.blogspot.com/


From python at rcn.com  Wed May  4 20:26:49 2005
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 4 May 2005 14:26:49 -0400
Subject: [Python-Dev] my first post: asking about a "decorator" module
In-Reply-To: <4edc17eb050504094950154ed0@mail.gmail.com>
Message-ID: <002401c550d6$d669ab80$11bd2c81@oemcomputer>

> Are there plans to improve decorator support in future Python
> versions?  By "improving decorator support" I mean for instance a
> module in the standard library providing some commonly used
> decorators such as ``memoize``, or utilities to create and compose
> decorators, and things like that.

Ultimately, some of these will likely end-up in the library.  For the
time being, I think it best that these get posted and evolve either as
Wiki entries or as ASPN entries.  The best practices and proven winners
have yet to emerge.  Solidifying first attempts is likely not a good
idea.  Putting tools in the standard library should be the last
evolutionary step, not the first.


Raymond Hettinger

From jwp at localhost.lit.jwp.name  Wed May  4 19:52:53 2005
From: jwp at localhost.lit.jwp.name (James William Pye)
Date: Wed, 04 May 2005 10:52:53 -0700
Subject: [Python-Dev] Need to hook Py_FatalError
In-Reply-To: <Xns964CBEA1EFA75token@80.91.229.5>
References: <20050503112637.648F.JCARLSON@uci.edu>
	<Xns964BE20F692E9token@80.91.229.5>
	<20050503132639.6492.JCARLSON@uci.edu>
	<1115171686.62180.48.camel@localhost>
	<Xns964C846BCF835token@80.91.229.5>
	<1115218734.62180.119.camel@localhost>
	<Xns964CBEA1EFA75token@80.91.229.5>
Message-ID: <1115229173.62180.171.camel@localhost>

On Wed, 2005-05-04 at 15:29 +0000, M.Utku K. wrote:
> Extension developer may discard or call them at his own will.

That's the issue, an extension developer shouldn't be able to discard
it, as I, the embedder, do not want my hook to be clobbered. The
extension developer doesn't set the context of the application, the
embedder does.

> What if more than one extension needs it ?

Firstly, I don't think it is likely that an extension module *by itself*
would ever have to initialize something that would *require* some form
of cleanup if the app were to fatal out. If it did, I would suspect poor
design, with any exceptions likely to be few and far between. Now, that
doesn't mean its use during the process might not create some state or
side effect where cleanup would be nice. Although chances are that such
cleanup should occur during normal operations and be handled via a
Python exception, which a fatal error is not.
-- 
Regards, James William Pye
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 187 bytes
Desc: This is a digitally signed message part
Url : http://mail.python.org/pipermail/python-dev/attachments/20050504/60abdfb4/attachment.pgp

From fredrik at pythonware.com  Wed May  4 20:29:18 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Wed, 4 May 2005 20:29:18 +0200
Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword
References: <1115226841.7909.24.camel@localhost>
Message-ID: <d5b45s$ee8$2@sea.gmane.org>

Gustavo J. A. M. Carneiro wrote:

>   I have not read every email about this subject, so sorry if this has
> already been mentioned.
>
>   In PEP 340 I read:
>
>         block EXPR1 as VAR1:
>             BLOCK1
>
>   I think it would be much clearer this (plus you save one keyword):
>
>         block VAR1 = EXPR1:
>             BLOCK1

clearer for whom?   where else is this construct used in Python?

</F>




From fredrik at pythonware.com  Wed May  4 20:33:10 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Wed, 4 May 2005 20:33:10 +0200
Subject: [Python-Dev] New Py_UNICODE doc
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>
	<4qdjm0rt.fsf@python.net>
Message-ID: <d5b45s$ee8$3@sea.gmane.org>

Thomas Heller wrote:

> AFAIK, you can configure Python to use 16-bits or 32-bits Unicode chars,
> independend from the size of wchar_t.  The HAVE_USABLE_WCHAR_T macro
> can be used by extension writers to determine if Py_UNICODE is the same as
> wchar_t.

note that "usable" means more than just "same size"; it also implies that the
wide-character predicates (iswalnum etc.) work properly with Unicode
characters under all locales.

</F>




From reinhold-birkenfeld-nospam at wolke7.net  Wed May  4 20:36:34 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 04 May 2005 20:36:34 +0200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <20020107054513.566d74ed@localhost.localdomain>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>
	<20020107054513.566d74ed@localhost.localdomain>
Message-ID: <d5b4dr$fa8$1@sea.gmane.org>

Rodrigo Dias Arruda Senra wrote:
>  [ Guido ]:
>  > 1. Decide on a keyword to use, if any.
> 
>  Shouldn't be the other way around ?
>  Decide to use *no* keyword, if that could be avoided. 

[...]
>  Following the PEP and this thread, it seems to me that __no keyword__
>  is less preferable than __some keyword__(=='block' so far), and I wonder
>  why is not the reverse. Perhaps I missed something ?

There is one problem with using no keyword: You cannot use arbitrary expressions
in the new statement. Consider:

resource = opening("file.txt")
block resource:
    (...)

resource = opening("file.txt")
resource:
    (...)

The latter would have to be forbidden.

(Seeing these examples, I somehow strongly dislike "block";
"with" or "using" seem really better)

Reinhold

-- 
Mail address is perfectly valid!


From tim.peters at gmail.com  Wed May  4 20:41:22 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Wed, 4 May 2005 14:41:22 -0400
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc205050410097355aa80@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
Message-ID: <1f7befae05050411416c198c54@mail.gmail.com>

[Guido]
> I'm +1 on accepting this now -- anybody against?

I'm curious to know if you (Guido) remember why you removed this
feature in Python 0.9.6?  From the HISTORY file:

"""
New features in 0.9.6:
- stricter try stmt syntax: cannot mix except and finally clauses on 1 try
"""

IIRC (and I may well not), half of people guessed wrong about whether
an exception raised in an "except:" suite would or would not skip
execution of the same-level "finally:" suite.

try:
    1/0
except ZeroDivisionError:
    2/0
finally:
    print "yes or no?"

The complementary question is whether an exception in the "finally:"
suite will be handled by the same-level "except:" suites.

There are obvious answers to both, of course.  The question is whether
they're the _same_ obvious answers across responders <0.7 wink>.
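A nested version removes the guesswork, and it is the behaviour a combined form
would most naturally inherit: the finally suite still runs even when the except
suite raises a new exception. A sketch (the logged strings are illustrative):

```python
log = []
try:
    try:
        1 / 0
    except ZeroDivisionError:
        log.append("except")
        # Raising from inside the except suite does NOT skip finally.
        raise ValueError("raised from the except suite")
    finally:
        log.append("finally")
except ValueError:
    log.append("caught outside")

print(log)  # ['except', 'finally', 'caught outside']
```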

From bob at redivi.com  Wed May  4 20:45:25 2005
From: bob at redivi.com (Bob Ippolito)
Date: Wed, 4 May 2005 14:45:25 -0400
Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword
In-Reply-To: <d5b45s$ee8$2@sea.gmane.org>
References: <1115226841.7909.24.camel@localhost> <d5b45s$ee8$2@sea.gmane.org>
Message-ID: <2655785A-547C-4832-AC3F-A9271DB61A06@redivi.com>


On May 4, 2005, at 2:29 PM, Fredrik Lundh wrote:

> Gustavo J. A. M. Carneiro wrote:
>
>
>>   I have not read every email about this subject, so sorry if this  
>> has
>> already been mentioned.
>>
>>   In PEP 340 I read:
>>
>>         block EXPR1 as VAR1:
>>             BLOCK1
>>
>>   I think it would be much clearer this (plus you save one keyword):
>>
>>         block VAR1 = EXPR1:
>>             BLOCK1
>>
>
> clearer for whom?   where else is this construct used in Python?

It might be more clear to have the "var" on the left.  The only place  
it's used on the right (that I know of) is in import statements when  
using the "as" clause.  Assignment, for loops, generator expressions,  
list comprehensions, etc. always have the var on the left.

-bob


From noamraph at gmail.com  Wed May  4 20:57:42 2005
From: noamraph at gmail.com (Noam Raphael)
Date: Wed, 4 May 2005 20:57:42 +0200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <d5b4dr$fa8$1@sea.gmane.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
	<ca471dc2050503132010abb4df@mail.gmail.com>
	<20020107054513.566d74ed@localhost.localdomain>
	<d5b4dr$fa8$1@sea.gmane.org>
Message-ID: <b348a0850505041157dfb3659@mail.gmail.com>

On 5/4/05, Reinhold Birkenfeld <reinhold-birkenfeld-nospam at wolke7.net> wrote:
> 
> There is one problem with using no keyword: You cannot use arbitrary expressions
> in the new statement. Consider:
> 
> resource = opening("file.txt")
> block resource:
>     (...)
> 
> resource = opening("file.txt")
> resource:
>     (...)
> 
> The latter would have to be forbidden.

Can you explain why it would have to be forbidden please?

Thanks,
Noam

From shane at hathawaymix.org  Wed May  4 21:02:40 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Wed, 04 May 2005 13:02:40 -0600
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
References: <20050503150510.GA13595@onegeek.org>
	<427797D5.8030207@cirad.fr>	<17015.39213.522060.873605@montanaro.dyndns.org>	<ca471dc20505031013287a2e92@mail.gmail.com>	<17015.48830.223391.390538@montanaro.dyndns.org>	<ca471dc2050503113127f938b0@mail.gmail.com>	<d58tkb$fvp$1@sea.gmane.org>	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
Message-ID: <42791C50.3090107@hathawaymix.org>

Alex Martelli wrote:
> Looking for a file with a certain magicnumber in its 1st two bytes...?
> 
> for name in filenames:
>      opening(name) as f:
>          if f.read(2) == 0xFEB0: break
> 
> This does seem to make real-life sense to me...

I'd like to suggest a small language enhancement that would fix this
example.  Allow the break and continue statements to use a keyword,
either "for" or "while", to state that the code should break out of both
the block statement and the innermost "for" or "while" statement.  The
example above would change to:

    for name in filenames:
        opening(name) as f:
            if f.read(2) == 0xFEB0:
                break for

This could be a separate PEP if necessary.  When a "break for" is used
in a block statement, it should raise a new kind of exception,
BreakForLoop, and the block statement should propagate the exception.
When used outside a block statement, "break for" can use existing Python
byte code to jump directly to the next appropriate statement.

Shane
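The proposed "break for" can be emulated today with a sentinel exception, which
is essentially the mechanism described above. A sketch (all names here are
illustrative; file contents are faked with an in-memory mapping standing in for
the opening() block):

```python
class BreakForLoop(Exception):
    """Stand-in for the exception 'break for' would raise."""

MAGIC = b"\xfe\xb0"

def first_match(filenames, contents):
    # contents: hypothetical mapping of name -> bytes.
    found = None
    try:
        for name in filenames:
            data = contents[name]
            if data[:2] == MAGIC:
                found = name
                raise BreakForLoop  # plays the role of "break for"
    except BreakForLoop:
        pass
    return found

print(first_match(["a", "b", "c"],
                  {"a": b"\x00\x00", "b": b"\xfe\xb0zz", "c": b"\xfe\xb0"}))
```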

From theller at python.net  Wed May  4 20:59:24 2005
From: theller at python.net (Thomas Heller)
Date: Wed, 04 May 2005 20:59:24 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <d5b45s$ee8$3@sea.gmane.org> (Fredrik Lundh's message of "Wed,
	4 May 2005 20:33:10 +0200")
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>
	<4qdjm0rt.fsf@python.net> <d5b45s$ee8$3@sea.gmane.org>
Message-ID: <mzraddgz.fsf@python.net>

"Fredrik Lundh" <fredrik at pythonware.com> writes:

> Thomas Heller wrote:
>
>> AFAIK, you can configure Python to use 16-bit or 32-bit Unicode chars,
>> independent of the size of wchar_t.  The HAVE_USABLE_WCHAR_T macro
>> can be used by extension writers to determine if Py_UNICODE is the same as
>> wchar_t.
>
> note that "usable" is more than just "same size"; it also implies that widechar
> predicates (iswalnum etc) work properly with Unicode characters, under all
> locales.

Ok, so who is going to collect the wisdom of this thread into the docs?

Thomas


From m.u.k.2 at gawab.com  Wed May  4 20:52:51 2005
From: m.u.k.2 at gawab.com (M.Utku K.)
Date: Wed, 4 May 2005 18:52:51 +0000 (UTC)
Subject: [Python-Dev] Need to hook Py_FatalError
References: <20050503112637.648F.JCARLSON@uci.edu>
	<Xns964BE20F692E9token@80.91.229.5>
	<20050503132639.6492.JCARLSON@uci.edu>
	<1115171686.62180.48.camel@localhost>
	<Xns964C846BCF835token@80.91.229.5>
	<1115218734.62180.119.camel@localhost>
	<Xns964CBEA1EFA75token@80.91.229.5>
Message-ID: <Xns964CE11B16061token@80.91.229.5>

Hi all,

> strip.....
> What if more than one extension needs it ?
> Currently I'm doing
> 
> callback_type SetCallBack(callback_type newfunc)
> 
> This will set the callback to newfunc and return the old one. Extension 
> developer may discard or call them at his own will. What do you think?
> 


Ok then, it will be a one-shot callback registration.


By the way, the declaration of the function
(  void SetFatalError_Callback(PyFatalError_Func func)  )
is in "pyerrors.h" but the implementation
is in "Pythonrun.c". Is that OK? I'm listening for more.

Best regards.



From shane at hathawaymix.org  Wed May  4 21:08:59 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Wed, 04 May 2005 13:08:59 -0600
Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword
In-Reply-To: <1115226841.7909.24.camel@localhost>
References: <1115226841.7909.24.camel@localhost>
Message-ID: <42791DCB.5050703@hathawaymix.org>

Gustavo J. A. M. Carneiro wrote:
>   In PEP 340 I read:
> 
>         block EXPR1 as VAR1:
>             BLOCK1
> 
>   I think it would be much clearer this (plus you save one keyword):
> 
>         block VAR1 = EXPR1:
>             BLOCK1

I think you misunderstood the statement.  EXPR1 creates an iterator,
then VAR1 iterates over the values returns by the iterator.  VAR1 never
sees the iterator.  Using your syntax would reinforce the
misinterpretation that VAR1 sees the iterator.

Shane

From rodsenra at gpr.com.br  Wed May  4 21:12:33 2005
From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra)
Date: Wed, 4 May 2005 16:12:33 -0300
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <d5b4dr$fa8$1@sea.gmane.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
	<ca471dc2050503132010abb4df@mail.gmail.com>
	<20020107054513.566d74ed@localhost.localdomain>
	<d5b4dr$fa8$1@sea.gmane.org>
Message-ID: <20050504161233.541b3fc6@localhost.localdomain>

[ Senra ]:
> >  [ Guido ]:
> >  > 1. Decide on a keyword to use, if any.
> > 
> >  Shouldn't be the other way around ?
> >  Decide to use *no* keyword, if that could be avoided. 

[ Reinhold ]:
> There is one problem with using no keyword: You cannot use arbitrary expressions
> in the new statement. Consider:
> 
> resource = opening("file.txt")
> block resource:
>     (...)
> 
> resource = opening("file.txt")
> resource:
>     (...)
> 
> The latter would have to be forbidden.


I'm not quite sure why, but there seems to be a
workaround (foreseen in PEP 340). And people seem
to be "using" this already <0.5 wink>:

 [Alex Martelli]:
 > for name in filenames:
 >      opening(name) as f:
 >          if f.read(2) == 0xFEB0: break

Moreover, an anonymous block should have
no <<name>> (neither 'block', 'with', 'using')
to be truly anonymous <1.0-Tim-Peter'ly wink>

cheers,
Senra

-- 
Rodrigo Senra                 
--
MSc Computer Engineer    rodsenra(at)gpr.com.br  
GPr Sistemas Ltda        http://www.gpr.com.br/ 
Personal Blog     http://rodsenra.blogspot.com/


From shane at hathawaymix.org  Wed May  4 21:14:03 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Wed, 04 May 2005 13:14:03 -0600
Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword
In-Reply-To: <42791DCB.5050703@hathawaymix.org>
References: <1115226841.7909.24.camel@localhost>
	<42791DCB.5050703@hathawaymix.org>
Message-ID: <42791EFB.7090200@hathawaymix.org>

Shane Hathaway wrote:
> Gustavo J. A. M. Carneiro wrote:
> 
>>  In PEP 340 I read:
>>
>>        block EXPR1 as VAR1:
>>            BLOCK1
>>
>>  I think it would be much clearer this (plus you save one keyword):
>>
>>        block VAR1 = EXPR1:
>>            BLOCK1
> 
> 
> I think you misunderstood the statement.  EXPR1 creates an iterator,
> then VAR1 iterates over the values returns by the iterator.  VAR1 never
                                     ^^^^^^^^^^
                                     returned by
> sees the iterator.  Using your syntax would reinforce the
> misinterpretation that VAR1 sees the iterator.

From mitja.marn at gmail.com  Wed May  4 21:10:34 2005
From: mitja.marn at gmail.com (Mitja Marn)
Date: Wed, 4 May 2005 21:10:34 +0200
Subject: [Python-Dev] "begin" as keyword for pep 340
In-Reply-To: <d5aul0$l90$1@sea.gmane.org>
References: <d7cee4a7050504094122979f5a@mail.gmail.com>
	<d5aul0$l90$1@sea.gmane.org>
Message-ID: <bc985b0005050412102fea87a6@mail.gmail.com>

On 5/4/05, Reinhold Birkenfeld <reinhold-birkenfeld-nospam at wolke7.net> wrote:
> PS: What about "using"? Too C#-ish?


Another idea from a hobbyist programmer: "holding" or maybe just
"hold". Like this:

hold locked(myLock):
            # Code here executes with myLock held.  The lock is
            # guaranteed to be released when the block is left (even
            # if via return or by an uncaught exception).

hold opened("/etc/passwd") as f:
            for line in f:
                print line.rstrip()

From m.u.k.2 at gawab.com  Wed May  4 21:05:35 2005
From: m.u.k.2 at gawab.com (M.Utku K.)
Date: Wed, 4 May 2005 19:05:35 +0000 (UTC)
Subject: [Python-Dev] Need to hook Py_FatalError
References: <20050503112637.648F.JCARLSON@uci.edu>
	<Xns964BE20F692E9token@80.91.229.5>
	<20050503132639.6492.JCARLSON@uci.edu>
	<1115171686.62180.48.camel@localhost>
	<Xns964C846BCF835token@80.91.229.5>
	<1115218734.62180.119.camel@localhost>
	<Xns964CBEA1EFA75token@80.91.229.5>
	<Xns964CE11B16061token@80.91.229.5>
Message-ID: <Xns964CE343D4B6Ftoken@80.91.229.5>

"M.Utku K." <m.u.k.2 at gawab.com> wrote in news:Xns964CE11B16061token@
80.91.229.5:

> (  void SetFatalError_Callback(PyFatalError_Func func)  )
> is in "pyerrors.h" but the implementation
> is in "Pythonrun.c". Is that OK? I'm listening for more.
> 

Sorry, just checked: the decl. will be in "pydebug.h"


From skip at pobox.com  Wed May  4 21:19:25 2005
From: skip at pobox.com (Skip Montanaro)
Date: Wed, 4 May 2005 14:19:25 -0500
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc205050410097355aa80@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
Message-ID: <17017.8253.375246.734512@montanaro.dyndns.org>


    Guido> Nice one. Should be a piece of cake to implement. Please talk to
    Guido> peps at python.org about getting it checked into the PEP database.

    Guido> I'm +1 on accepting this now -- anybody against?

I'm not against it, but I thought there were ambiguity reasons that this
construct wasn't already implemented.  I'm pretty sure people have asked
about it before but been rebuffed.

Here's a message with Reinhold's fingerprints on it:

    http://mail.python.org/pipermail/python-list/2004-June/227008.html

Here's another one:

    http://mail.python.org/pipermail/python-list/2003-November/193159.html

Both reference other articles which presumably have more details about the
reasoning, but I was unable to find them with a quick search.

Skip

From python-dev at zesty.ca  Wed May  4 21:20:09 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Wed, 4 May 2005 14:20:09 -0500 (CDT)
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <42791C50.3090107@hathawaymix.org>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<42791C50.3090107@hathawaymix.org>
Message-ID: <Pine.LNX.4.58.0505041419510.4786@server1.LFW.org>

On Wed, 4 May 2005, Shane Hathaway wrote:
> I'd like to suggest a small language enhancement that would fix this
> example.  Allow the break and continue statements to use a keyword,
> either "for" or "while", to state that the code should break out of both
> the block statement and the innermost "for" or "while" statement.  The
> example above would change to:
>
>     for name in filenames:
>         opening(name) as f:
>             if f.read(2) == 0xFEB0:
>                 break for

This is very elegant.  It works beautifully with "break", though at
first the natural analogs "continue for", "continue while" appear to
conflict with Guido's proposed extension to "continue".

But if we choose the keyword "with" to introduce an anonymous block,
it comes out rather nicely:

    continue with 2

That's easier to read than "continue 2", in my opinion.  (If it's not
too cute for you.)

Anyway, i like the general idea of letting the programmer specify
exactly which block to break/continue, instead of leaving it looking
ambiguous.  Explicit is better than implicit, right?


-- ?!ng

From rodsenra at gpr.com.br  Wed May  4 21:24:10 2005
From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra)
Date: Wed, 4 May 2005 16:24:10 -0300
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <42791C50.3090107@hathawaymix.org>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<42791C50.3090107@hathawaymix.org>
Message-ID: <20050504162410.7e7fcd14@localhost.localdomain>

[ Shane Hathaway ]:
> I'd like to suggest a small language enhancement that would fix this
> example.  Allow the break and continue statements to use a keyword,
> either "for" or "while", to state that the code should break out of both
> the block statement and the innermost "for" or "while" statement.  The
> example above would change to:
> 
>     for name in filenames:
>         opening(name) as f:
>             if f.read(2) == 0xFEB0:
>                 break for
> 
> This could be a separate PEP if necessary.  When a "break for" is used
> in a block statement, it should raise a new kind of exception,
> BreakForLoop, and the block statement should propagate the exception.
> When used outside a block statement, "break for" can use existing Python
> byte code to jump directly to the next appropriate statement.

 What about nested blocks? When they act as iterators, that would be
 desirable too.

 What to do then: 

  - baptize blocks  -> break <name>
  - keep them anonymous ->  break #enclosing_scope_counter
  - do not support them 

 cheers,
 Senra

-- 
Rodrigo Senra                 
--
MSc Computer Engineer    rodsenra(at)gpr.com.br  
GPr Sistemas Ltda        http://www.gpr.com.br/ 
Personal Blog     http://rodsenra.blogspot.com/


From barry at python.org  Wed May  4 21:20:31 2005
From: barry at python.org (Barry Warsaw)
Date: Wed, 04 May 2005 15:20:31 -0400
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <1f7befae05050411416c198c54@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
Message-ID: <1115234431.21863.23.camel@geddy.wooz.org>

On Wed, 2005-05-04 at 14:41, Tim Peters wrote:

> IIRC (and I may well not), half of people guessed wrong about whether
> an exception raised in an "except:" suite would or would not skip
> execution of the same-level "finally:" suite.

It would not, obviously <wink>.

> try:
>     1/0
> except ZeroDivisionError:
>     2/0
> finally:
>     print "yes or no?"
> 
> The complementary question is whether an exception in the "finally:"
> suite will be handled by the same-level "except:" suites.

It would not, obviously <wink>.

> There are obvious answers to both, of course.  The question is whether
> they're the _same_ obvious answers across responders <0.7 wink>.

It only matters that it's the same obvious answers across all responders
who are right. :)

-Barry


From shane at hathawaymix.org  Wed May  4 21:31:23 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Wed, 04 May 2005 13:31:23 -0600
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <Pine.LNX.4.58.0505041402220.4786@server1.LFW.org>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<42791C50.3090107@hathawaymix.org>
	<Pine.LNX.4.58.0505041402220.4786@server1.LFW.org>
Message-ID: <4279230B.4050902@hathawaymix.org>

Ka-Ping Yee wrote:
> On Wed, 4 May 2005, Shane Hathaway wrote:
>>
>>    for name in filenames:
>>        opening(name) as f:
>>            if f.read(2) == 0xFEB0:
>>                break for
> 
> 
> This is very elegant.

Thanks.

>  It works beautifully with "break", though at
> first that natural analogs "continue for", "continue while" appear to
> conflict with Guido's proposed extension to "continue".
> 
> But if we choose the keyword "with" to introduce an anonymous block,
> it comes out rather nicely:
> 
>     continue with 2
> 
> That's easier to read than "continue 2", in my opinion.  (If it's not
> too cute for you.)

Or perhaps:

    continue yield 2

This would create some symmetry, since generators will retrieve the
value passed by a continue statement using a yield expression.

Shane

From gvanrossum at gmail.com  Wed May  4 21:27:03 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 4 May 2005 12:27:03 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <1f7befae05050411416c198c54@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
Message-ID: <ca471dc2050504122748f894d8@mail.gmail.com>

[Tim]
> I'm curious to know if you (Guido) remember why you removed this
> feature in Python 0.9.6?  From the HISTORY file:
> 
> """
> New features in 0.9.6:
> - stricter try stmt syntax: cannot mix except and finally clauses on 1 try
> """
> 
> IIRC (and I may well not), half of people guessed wrong about whether
> an exception raised in an "except:" suite would or would not skip
> execution of the same-level "finally:" suite.
> 
> try:
>     1/0
> except ZeroDivisionError:
>     2/0
> finally:
>     print "yes or no?"
> 
> The complementary question is whether an exception in the "finally:"
> suite will be handled by the same-level "except:" suites.

No. The rule of thumb is that control only passes forward.

> There are obvious answers to both, of course.  The question is whether
> they're the _same_ obvious answers across responders <0.7 wink>.

I think the main person confused was me. :-)

In addition, at the time I don't think I knew Java -- certainly I
didn't know it well enough to realize that it gives this construct the
meaning proposed by Reinhold's PEP.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From reinhold-birkenfeld-nospam at wolke7.net  Wed May  4 21:28:23 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 04 May 2005 21:28:23 +0200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <b348a0850505041157dfb3659@mail.gmail.com>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>
	<b348a0850505041157dfb3659@mail.gmail.com>
Message-ID: <d5b7f0$spf$1@sea.gmane.org>

Noam Raphael wrote:
> On 5/4/05, Reinhold Birkenfeld <reinhold-birkenfeld-nospam at wolke7.net> wrote:
>> 
>> There is one problem with using no keyword: You cannot use arbitrary expressions
>> in the new statement. Consider:
>> 
>> resource = opening("file.txt")
>> block resource:
>>     (...)
>> 
>> resource = opening("file.txt")
>> resource:
>>     (...)
>> 
>> The latter would have to be forbidden.
> 
> Can you explain why it would have to be forbidden please?

Well, with it you could create suites with _any_ introducing
identifier. Consider:

with:
    (...)

synchronized:
    (...)

try:
    (...)

transaction:
    (...)


Do you understand my concern? It would be very, very hard to discern
these "user-defined statements" from real language constructs.

Reinhold

-- 
Mail address is perfectly valid!


From reinhold-birkenfeld-nospam at wolke7.net  Wed May  4 21:33:29 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 04 May 2005 21:33:29 +0200
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <1f7befae05050411416c198c54@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
Message-ID: <d5b7oj$uav$1@sea.gmane.org>

Tim Peters wrote:
> [Guido]
>> I'm +1 on accepting this now -- anybody against?
> 
> I'm curious to know if you (Guido) remember why you removed this
> feature in Python 0.9.6?  From the HISTORY file:
> 
> """
> New features in 0.9.6:
> - stricter try stmt syntax: cannot mix except and finally clauses on 1 try
> """
>
> IIRC (and I may well not), half of people guessed wrong about whether
> an exception raised in an "except:" suite would or would not skip
> execution of the same-level "finally:" suite.

With the arrival of Java and C#, which both have this feature,
I think the wrong guesses are minimized.

I think the behaviour of the "else" clause is much harder to guess,
mainly when used with the looping constructs.

> try:
>     1/0
> except ZeroDivisionError:
>     2/0
> finally:
>     print "yes or no?"
> 
> The complementary question is whether an exception in the "finally:"
> suite will be handled by the same-level "except:" suites.

No, as except clauses can only occur before the finally clause, and execution
should not go backwards.

> There are obvious answers to both, of course.  The question is whether
> they're the _same_ obvious answers across responders <0.7 wink>.

Reinhold

-- 
Mail address is perfectly valid!


From python-dev at zesty.ca  Wed May  4 21:42:50 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Wed, 4 May 2005 14:42:50 -0500 (CDT)
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <d5b7f0$spf$1@sea.gmane.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
	<ca471dc2050503132010abb4df@mail.gmail.com>
	<20020107054513.566d74ed@localhost.localdomain>
	<d5b4dr$fa8$1@sea.gmane.org>
	<b348a0850505041157dfb3659@mail.gmail.com> <d5b7f0$spf$1@sea.gmane.org>
Message-ID: <Pine.LNX.4.58.0505041437100.4786@server1.LFW.org>

Reinhold Birkenfeld wrote:
> There is one problem with using no keyword: You cannot use arbitrary
> expressions in the new statement.
[...]
> resource = opening("file.txt")
> resource:
>     (...)
>
> The latter would have to be forbidden.

Noam Raphael wrote:
> Can you explain why it would have to be forbidden please?

Reinhold Birkenfeld wrote:
> Well, with it you could create suites with _any_ introducing
> identifier. Consider:
>
> with:
>     (...)
>
> synchronized:
>     (...)
>
> try:
>     (...)
>
> transaction:
>     (...)
>
> Do you understand my concern? It would be very, very hard to discern
> these "user-defined statements" from real language constructs.

I think part of the debate is about whether that's good or bad.
I happen to agree with you -- i think a keyword is necessary --
but i believe some people see an advantage in having the flexibility
to make a "real-looking" construct.

As i see it the argument boils down to: Python is not Lisp.

There are good reasons why the language has keywords, why it
distinguishes statements from expressions, uses indentation, and
so on.  All of these properties cause Python programs to be made
of familiar and easily recognizable patterns instead of degenerating
into a homogeneous pile of syntax.


-- ?!ng

From reinhold-birkenfeld-nospam at wolke7.net  Wed May  4 21:45:36 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 04 May 2005 21:45:36 +0200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <Pine.LNX.4.58.0505041437100.4786@server1.LFW.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>	<b348a0850505041157dfb3659@mail.gmail.com>
	<d5b7f0$spf$1@sea.gmane.org>
	<Pine.LNX.4.58.0505041437100.4786@server1.LFW.org>
Message-ID: <d5b8f8$vjf$1@sea.gmane.org>

Ka-Ping Yee wrote:

> Reinhold Birkenfeld wrote:
>> Well, with it you could create suites with _any_ introducing
>> identifier. Consider:
>>
>> with:
>>     (...)
>>
>> synchronized:
>>     (...)
>>
>> try:
>>     (...)
>>
>> transaction:
>>     (...)
>>
>> Do you understand my concern? It would be very, very hard to discern
>> these "user-defined statements" from real language constructs.
> 
> I think part of the debate is about whether that's good or bad.
> I happen to agree with you -- i think a keyword is necessary --
> but i believe some people see an advantage in having the flexibility
> to make a "real-looking" construct.

Yes. But it would only be crippled, as the "keyword" would have to be a
pre-constructed generator instance which cannot be easily reused as a
library export (at least, it is not intended this way).

> As i see it the argument boils down to: Python is not Lisp.
> 
> There are good reasons why the language has keywords, why it
> distinguishes statements from expressions, uses indentation, and
> so on.  All of these properties cause Python programs to be made
> of familiar and easily recognizable patterns instead of degenerating
> into a homogeneous pile of syntax.

Big ACK.

Reinhold

-- 
Mail address is perfectly valid!


From shane at hathawaymix.org  Wed May  4 21:54:48 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Wed, 04 May 2005 13:54:48 -0600
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <d5b7f0$spf$1@sea.gmane.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>	<b348a0850505041157dfb3659@mail.gmail.com>
	<d5b7f0$spf$1@sea.gmane.org>
Message-ID: <42792888.10209@hathawaymix.org>

Reinhold Birkenfeld wrote:
> Noam Raphael wrote:
> 
>>On 5/4/05, Reinhold Birkenfeld <reinhold-birkenfeld-nospam at wolke7.net> wrote:
>>>resource = opening("file.txt")
>>>resource:
>>>    (...)
>>>
>>>The latter would have to be forbidden.
>>
>>Can you explain why it would have to be forbidden please?
> 
> 
> Well, with it you could create suites with _any_ introducing
> identifier. Consider:
> [...]
> 
> transaction:
>     (...)
> 
> 
> Do you understand my concern? It would be very, very hard to discern
> these "user-defined statements" from real language constructs.

For each block statement, it is necessary to create a *new* iterator,
since iterators that have stopped are required to stay stopped.  So at a
minimum, user-defined statements will need to call something, and thus
will have parentheses.  The parentheses might be enough to make block
statements not look like built-in keywords.

PEP 340 seems to punish people for avoiding the parentheses:

    transaction = begin_transaction()

    transaction:
        db.execute('insert 3 into mytable')

    transaction:
        db.execute('insert 4 into mytable')

I expect that only '3' would be inserted in mytable.  The second use of
the transaction iterator will immediately raise StopIteration.

Shane
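The one-shot behaviour is easy to demonstrate with a plain generator, which is a
simplified stand-in for the PEP 340 transaction iterator (a real one would begin
a transaction before its yield and commit or abort afterwards):

```python
def begin_transaction():
    # Simplified stand-in: one yield, so exactly one "block" can run.
    yield "transaction body runs here"

t = begin_transaction()
print(next(t, "exhausted"))  # first use: the body runs
print(next(t, "exhausted"))  # second use: the generator stays stopped
```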

From reinhold-birkenfeld-nospam at wolke7.net  Wed May  4 21:53:50 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 04 May 2005 21:53:50 +0200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <42792888.10209@hathawaymix.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>	<b348a0850505041157dfb3659@mail.gmail.com>	<d5b7f0$spf$1@sea.gmane.org>
	<42792888.10209@hathawaymix.org>
Message-ID: <d5b8um$3rq$1@sea.gmane.org>

Shane Hathaway wrote:

> For each block statement, it is necessary to create a *new* iterator,

Right.

> since iterators that have stopped are required to stay stopped.  So at a
> minimum, user-defined statements will need to call something, and thus
> will have parentheses.  The parentheses might be enough to make block
> statements not look like built-in keywords.
> 
> PEP 340 seems to punish people for avoiding the parentheses:
> 
>     transaction = begin_transaction()
> 
>     transaction:
>         db.execute('insert 3 into mytable')
> 
>     transaction:
>         db.execute('insert 4 into mytable')
> 
> I expect that only '3' would be inserted in mytable.  The second use of
> the transaction iterator will immediately raise StopIteration.

Yes, but wouldn't you think that people would misunderstand it in this way?

Reinhold

-- 
Mail address is perfectly valid!


From bac at OCF.Berkeley.EDU  Wed May  4 22:10:21 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Wed, 04 May 2005 13:10:21 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc205050410097355aa80@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
Message-ID: <42792C2D.3080609@ocf.berkeley.edu>

Guido van Rossum wrote:
> Nice one. Should be a piece of cake to implement. Please talk to
> peps at python.org about getting it checked into the PEP database.
> 
> I'm +1 on accepting this now -- anybody against?
> 

I'm +1.  A couple of us discussed this at PyCon during the last day of the
sprints and we all thought that it could be done, but none of us felt strong
enough to write the PEP immediately.  So thanks to Reinhold for picking up our
slack.  =)

-Brett

From tim.peters at gmail.com  Wed May  4 22:14:23 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Wed, 4 May 2005 16:14:23 -0400
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <d5b7oj$uav$1@sea.gmane.org>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
Message-ID: <1f7befae05050413141e1128dc@mail.gmail.com>

[Reinhold Birkenfeld]
> ...
> I think the behaviour of the "else" clause is much harder to guess,
> mainly when used with the looping constructs.

No, that's obvious <wink>.

What about `else` mixed with try/except/finally?

try:
    A
except:
    B
else:
    C
finally:
    D

If A executes without exception, does D execute before or after C?

I'm not saying we can't make up reasonable answers.  I'm saying they
look more-or-less arbitrary, while the current nested forms are always
clear.

From shane at hathawaymix.org  Wed May  4 22:23:41 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Wed, 04 May 2005 14:23:41 -0600
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <d5b8um$3rq$1@sea.gmane.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>	<b348a0850505041157dfb3659@mail.gmail.com>	<d5b7f0$spf$1@sea.gmane.org>	<42792888.10209@hathawaymix.org>
	<d5b8um$3rq$1@sea.gmane.org>
Message-ID: <42792F4D.4010706@hathawaymix.org>

Reinhold Birkenfeld wrote:
> Shane Hathaway wrote:
> 
> 
>>For each block statement, it is necessary to create a *new* iterator,
> 
> 
> Right.
> 
> 
>>since iterators that have stopped are required to stay stopped.  So at a
>>minimum, user-defined statements will need to call something, and thus
>>will have parentheses.  The parentheses might be enough to make block
>>statements not look like built-in keywords.
>>
>>PEP 340 seems to punish people for avoiding the parentheses:
>>
>>    transaction = begin_transaction()
>>
>>    transaction:
>>        db.execute('insert 3 into mytable')
>>
>>    transaction:
>>        db.execute('insert 4 into mytable')
>>
>>I expect that only '3' would be inserted in mytable.  The second use of
>>the transaction iterator will immediately raise StopIteration.
> 
> 
> Yes, but wouldn't you think that people would misunderstand it in this way?

Yes, they might.  Just to be clear, the risk is that people will try to
write statements without parentheses and get burned because their code
doesn't get executed, right?

A possible workaround is to identify iterators that have already
finished.  StopIteration doesn't distinguish between an iterator that
never yields any values from an iterator that has yielded all of its
values.  Maybe there should be a subclass of StopIteration like
"AlreadyStoppedIteration".  Then, if a block statement gets an
AlreadyStoppedIteration exception from its iterator, it should convert
that to an error like "InvalidBlockError".
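That idea can be sketched in plain Python.  Everything here is hypothetical
per the proposal above: AlreadyStoppedIteration is not a real exception, and
BlockIterator is a made-up wrapper (the InvalidBlockError conversion would
belong to the block machinery, so it is left out):

```python
class AlreadyStoppedIteration(StopIteration):
    """Hypothetical subclass marking reuse of a finished iterator."""

class BlockIterator:
    # Wraps an iterator; once exhausted, further next() calls raise
    # AlreadyStoppedIteration instead of plain StopIteration.
    def __init__(self, iterable):
        self._it = iter(iterable)
        self._stopped = False

    def __iter__(self):
        return self

    def __next__(self):
        if self._stopped:
            raise AlreadyStoppedIteration
        try:
            return next(self._it)
        except StopIteration:
            self._stopped = True
            raise

b = BlockIterator([3])
assert next(b) == 3
try:
    next(b)                  # normal exhaustion: plain StopIteration
except AlreadyStoppedIteration:
    raise AssertionError('subclass raised too early')
except StopIteration:
    pass
try:
    next(b)                  # reuse: the distinguishable subclass
except AlreadyStoppedIteration:
    pass
```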

Shane

From gjc at inescporto.pt  Wed May  4 22:21:15 2005
From: gjc at inescporto.pt (Gustavo J. A. M. Carneiro)
Date: Wed, 04 May 2005 21:21:15 +0100
Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword
In-Reply-To: <42791DCB.5050703@hathawaymix.org>
References: <1115226841.7909.24.camel@localhost>
	<42791DCB.5050703@hathawaymix.org>
Message-ID: <1115238075.10836.4.camel@emperor>

On Wed, 2005-05-04 at 13:08 -0600, Shane Hathaway wrote:
> Gustavo J. A. M. Carneiro wrote:
> >   In PEP 340 I read:
> > 
> >         block EXPR1 as VAR1:
> >             BLOCK1
> > 
> >   I think it would be much clearer this (plus you save one keyword):
> > 
> >         block VAR1 = EXPR1:
> >             BLOCK1
> 
> I think you misunderstood the statement.  EXPR1 creates an iterator,
then VAR1 iterates over the values returned by the iterator.  VAR1 never
> sees the iterator.  Using your syntax would reinforce the
> misinterpretation that VAR1 sees the iterator.

  In that case,

      block VAR1 in EXPR1:
          BLOCK1

  And now I see how using 'for' statements (perhaps slightly changed)
turned up in the discussion.

  Sorry for the noise.

-- 
Gustavo J. A. M. Carneiro
<gjc at inescporto.pt> <gustavo at users.sourceforge.net>
The universe is always one step beyond logic


From skip at pobox.com  Wed May  4 22:27:33 2005
From: skip at pobox.com (Skip Montanaro)
Date: Wed, 4 May 2005 15:27:33 -0500
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <1f7befae05050413141e1128dc@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
	<1f7befae05050413141e1128dc@mail.gmail.com>
Message-ID: <17017.12341.491136.991788@montanaro.dyndns.org>


    Tim> What about `else` mixed with try/except/finally?

    Tim> try:
    Tim>     A
    Tim> except:
    Tim>     B
    Tim> else:
    Tim>     C
    Tim> finally:
    Tim>     D

    Tim> If A executes without exception, does D execute before or after C?

According to Guido, execution is A, C, D in the normal case and A, B, D in
the exceptional case.  Execution never jumps back.
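The order Skip describes was later pinned down when the unified form was
accepted (PEP 341), so it can be checked directly on a modern Python:

```python
def run(raise_exc):
    # Record which clauses execute, and in what order.
    order = []
    try:
        order.append('A')
        if raise_exc:
            raise ValueError
    except ValueError:
        order.append('B')
    else:
        order.append('C')
    finally:
        order.append('D')
    return order

assert run(False) == ['A', 'C', 'D']   # normal case: C before D
assert run(True) == ['A', 'B', 'D']    # exceptional case: no C
```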

    Tim> I'm not saying we can't make up reasonable answers.  I'm saying
    Tim> they look more-or-less arbitrary, while the current nested forms
    Tim> are always clear.

As far as arbitrary answers go, execution only in the forward direction
seems more reasonable than jumping forward and back.

Skip

From reinhold-birkenfeld-nospam at wolke7.net  Wed May  4 22:28:23 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 04 May 2005 22:28:23 +0200
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <1f7befae05050413141e1128dc@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>	<ca471dc205050410097355aa80@mail.gmail.com>	<1f7befae05050411416c198c54@mail.gmail.com>	<d5b7oj$uav$1@sea.gmane.org>
	<1f7befae05050413141e1128dc@mail.gmail.com>
Message-ID: <d5bavg$fjh$1@sea.gmane.org>

Tim Peters wrote:
> [Reinhold Birkenfeld]
>> ...
>> I think the behaviour of the "else" clause is much harder to guess,
>> mainly when used with the looping constructs.
> 
> No, that's obvious <wink>.

OK, I'm persuaded. Well you wield the Force, master.

> What about `else` mixed with try/except/finally?
> 
> try:
>     A
> except:
>     B
> else:
>     C
> finally:
>     D
> 
> If A executes without exception, does D execute before or after C?

Given the order of the clauses, is it so ambiguous?

Reinhold

-- 
Mail address is perfectly valid!


From aahz at pythoncraft.com  Wed May  4 22:52:27 2005
From: aahz at pythoncraft.com (Aahz)
Date: Wed, 4 May 2005 13:52:27 -0700
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <79990c6b0505040827941ff0@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<79990c6b0505040827941ff0@mail.gmail.com>
Message-ID: <20050504205226.GA25641@panix.com>

On Wed, May 04, 2005, Paul Moore wrote:
>
> Yes, that'd do. I can't say I think it would be common, but it's a
> valid case. And the workaround is the usual messy flag variable:
> 
> for name in filenames:
>     found = False
>     opening(name) as f:
>         if f.read(2) == 0xFEB0: found = True
>     if found: break

My standard workaround is using exceptions, but I'm not sure how that
interacts with a block:

    try:
        for name in filenames:
            with opened(name) as f:
                if f.read(2) == 0xFEB0:
                    raise Found
    except Found:
        pass
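The pattern works without any new syntax; a minimal runnable sketch, with
in-memory byte streams standing in for the files and `find_header` a made-up
name (`Found` and the 0xFEB0 signature are from the example above):

```python
import io

class Found(Exception):
    pass

def find_header(named_streams):
    # Return the name of the first stream whose first two bytes match,
    # breaking out of the loop via the Found exception.
    try:
        for name, stream in named_streams:
            with stream:                     # closed on the way out
                if stream.read(2) == b'\xfe\xb0':
                    raise Found(name)
    except Found as e:
        return e.args[0]
    return None

streams = [('a', io.BytesIO(b'\x00\x00data')),
           ('b', io.BytesIO(b'\xfe\xb0data')),
           ('c', io.BytesIO(b'\x00\x00data'))]
assert find_header(streams) == 'b'
```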
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"It's 106 miles to Chicago.  We have a full tank of gas, a half-pack of
cigarettes, it's dark, and we're wearing sunglasses."  "Hit it."

From rodsenra at gpr.com.br  Wed May  4 23:29:47 2005
From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra)
Date: Wed, 4 May 2005 18:29:47 -0300
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <d5b7f0$spf$1@sea.gmane.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
	<ca471dc2050503132010abb4df@mail.gmail.com>
	<20020107054513.566d74ed@localhost.localdomain>
	<d5b4dr$fa8$1@sea.gmane.org>
	<b348a0850505041157dfb3659@mail.gmail.com>
	<d5b7f0$spf$1@sea.gmane.org>
Message-ID: <20050504182947.6b95773a@localhost.localdomain>

[ Reinhold Birkenfeld ]:
> Well, with it you could create suites with _any_ introducing
> identifier. Consider:
> 
> with:
>     (...)
> 
> synchronized:
>     (...)
> 
> try:
>     (...)
> 
> transaction:
>     (...)
> 
> 
> Do you understand my concern? 

I definitely see your point. However,...

> It would be very, very hard to discern
> these "user-defined statements" from real language constructs.

 - today it is hard to distinguish a user-defined function
   from a builtin function. What is the problem with the
   former (anonymous blocks) that is accepted for the latter (functions)?
   I guess the solution is the same for both: documentation.

 - 'try' would probably be an invalid identifier/expression in a block,
    as well as any other reserved words. So no confusion arises
    from '''try:''' nor '''while''' nor '''for''' nor '''except''' etc

[ Ka-Ping Yee ]:
> My point is: there are good reasons why the language has keywords, why it
> distinguishes statements from expressions, uses indentation, and
> so on.  All of these properties cause Python programs to be made
> of familiar and easily recognizable patterns instead of degenerating
> into a homogeneous pile of syntax.

 I am completely in favour of preserving Python's readability and simplicity.

 But metaclasses and decorators introduced opportunities for some magical
 spells. Either you know their definition and how they modify their subjects,
 or your understanding of the code might be harmed to a certain degree.

 They were born without being baptized with a keyword, so why should blocks be?
 I think that the absence of 'name clashing', alone, is the strongest argument
 in favour of the __no keyword__ proposal.

 Recognizing a __no keyword__ block would be very easy. If you did not
 recognize it as something you already knew, then it was a block <0.2 wink>.

 best regards,
 Senra

-- 
Rodrigo Senra                 
--
MSc Computer Engineer    rodsenra(at)gpr.com.br  
GPr Sistemas Ltda        http://www.gpr.com.br/ 
Personal Blog     http://rodsenra.blogspot.com/


From reinhold-birkenfeld-nospam at wolke7.net  Wed May  4 23:31:53 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 04 May 2005 23:31:53 +0200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <20050504205226.GA25641@panix.com>
References: <20050503150510.GA13595@onegeek.org>
	<427797D5.8030207@cirad.fr>	<17015.39213.522060.873605@montanaro.dyndns.org>	<ca471dc20505031013287a2e92@mail.gmail.com>	<17015.48830.223391.390538@montanaro.dyndns.org>	<ca471dc2050503113127f938b0@mail.gmail.com>	<d58tkb$fvp$1@sea.gmane.org>	<79990c6b050504015762d004ac@mail.gmail.com>	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>	<79990c6b0505040827941ff0@mail.gmail.com>
	<20050504205226.GA25641@panix.com>
Message-ID: <d5bemh$hi2$1@sea.gmane.org>

Aahz wrote:
> On Wed, May 04, 2005, Paul Moore wrote:
>>
>> Yes, that'd do. I can't say I think it would be common, but it's a
>> valid case. And the workaround is the usual messy flag variable:
>> 
>> for name in filenames:
>>     found = False
>>     opening(name) as f:
>>         if f.read(2) == 0xFEB0: found = True
>>     if found: break
> 
> My standard workaround is using exceptions, but I'm not sure how that
> interacts with a block:
> 
>     try:
>         for name in filenames:
>             with opened(name) as f:
>                 if f.read(2) == 0xFEB0:
>                     raise Found
>     except Found:
>         pass

From a naive point of view, it should definitely work as expected.
From the PEP point of view, no clue. *hope*

Reinhold

-- 
Mail address is perfectly valid!


From martin at v.loewis.de  Thu May  5 00:00:34 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Thu, 05 May 2005 00:00:34 +0200
Subject: [Python-Dev] Py_UNICODE madness
In-Reply-To: <8c00ac9db9a6be3e4a937eb6290c9838@opnet.com>
References: <ff0cffe6186fb31e1f3b5c9b3a252814@opnet.com>	<ca471dc20505031544373d8c14@mail.gmail.com>
	<8c00ac9db9a6be3e4a937eb6290c9838@opnet.com>
Message-ID: <42794602.7080609@v.loewis.de>

Nicholas Bastin wrote:
> That makes PyUnicode_FromUnicode() a lot less useful.  Well, really, 
> not useful at all.

Why do you say that? Py_UNICODE is as good a type to store characters
as any other, and if you happen to have a Py_UNICODE[], you can use
that function to build a unicode object.

> You might suggest that PyUnicode_FromWideChar is more useful, but 
> that's only true on platforms that support wchar_t.

Useful to do what? PyInt_FromLong isn't useful if you have a void*,
either...

> Is there no universally supported way of moving buffers of unicode data 
> (as common data types, like unsigned short, etc.) into Python from C?

There is no common Unicode type in C, period (be it Python or not).
Your best bet is to prepare a Py_UNICODE[], either by copying from
your favourite Unicode type, or by casting it, e.g.

#if Py_UNICODE_IS_AS_WIDE_AS_MY_UNICODE_TYPE

   Py_UNICODE* data = (Py_UNICODE*) my_data;
   do_free = 0;

#else

   Py_UNICODE* data = malloc(sizeof(Py_UNICODE)*my_data_len);
   for (int i = 0; i < my_data_len; i++)
       data[i] = my_data[i];
   do_free = 1;
#endif

  /* PyUnicode_FromUnicode takes the buffer and its length */
  PyObject *uni = PyUnicode_FromUnicode(data, my_data_len);
  if (do_free)
      free(data);

Regards,
Martin

From martin at v.loewis.de  Thu May  5 00:03:37 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Thu, 05 May 2005 00:03:37 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <19b53d82c6eb11419ddf4cb529241f64@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>
	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>
Message-ID: <427946B9.6070500@v.loewis.de>

Nicholas Bastin wrote:
> "This type represents the storage type which is used by Python 
> internally as the basis for holding Unicode ordinals.  Extension module 
> developers should make no assumptions about the size of this type on 
> any given platform."

But people want to know "Is Python's Unicode 16-bit or 32-bit?"
So the documentation should explicitly say "it depends".

Regards,
Martin

From shane at hathawaymix.org  Thu May  5 00:20:45 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Wed, 04 May 2005 16:20:45 -0600
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427946B9.6070500@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>
	<427946B9.6070500@v.loewis.de>
Message-ID: <42794ABD.2080405@hathawaymix.org>

Martin v. Löwis wrote:
> Nicholas Bastin wrote:
> 
>>"This type represents the storage type which is used by Python 
>>internally as the basis for holding Unicode ordinals.  Extension module 
>>developers should make no assumptions about the size of this type on 
>>any given platform."
> 
> 
> But people want to know "Is Python's Unicode 16-bit or 32-bit?"
> So the documentation should explicitly say "it depends".

On a related note, it would help if the documentation provided a
little more background on Unicode encoding.  Specifically, that UCS-2 is
not the same as UTF-16, even though they're both two bytes wide and most
of the characters are the same.  UTF-16 can encode characters outside the
BMP as 4-byte surrogate pairs, while UCS-2 can't.  A Py_UNICODE is either
UCS-2 or UCS-4.  It took me
quite some time to figure that out so I could produce a patch [1]_ for
PyXPCOM  that fixes its unicode support.

.. [1] https://bugzilla.mozilla.org/show_bug.cgi?id=281156
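The distinction is easy to demonstrate on a modern Python 3 (where str is
always wide): a character outside the Basic Multilingual Plane takes two
16-bit units, a surrogate pair, in UTF-16, which fixed-width UCS-2 cannot
express:

```python
bmp = '\u00e9'            # U+00E9 LATIN SMALL LETTER E WITH ACUTE
astral = '\U0001d11e'     # U+1D11E MUSICAL SYMBOL G CLEF, outside the BMP

assert len(bmp.encode('utf-16-be')) == 2      # one 16-bit code unit
assert len(astral.encode('utf-16-be')) == 4   # surrogate pair
```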

Shane

From shane.holloway at ieee.org  Thu May  5 00:45:58 2005
From: shane.holloway at ieee.org (Shane Holloway (IEEE))
Date: Wed, 04 May 2005 16:45:58 -0600
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <17017.12341.491136.991788@montanaro.dyndns.org>
References: <d59vll$4qf$1@sea.gmane.org>	<ca471dc205050410097355aa80@mail.gmail.com>	<1f7befae05050411416c198c54@mail.gmail.com>	<d5b7oj$uav$1@sea.gmane.org>	<1f7befae05050413141e1128dc@mail.gmail.com>
	<17017.12341.491136.991788@montanaro.dyndns.org>
Message-ID: <427950A6.6090408@ieee.org>

And per the PEP, I think that explaining that::

     try:
         A
     except:
         B
     else:
         C
     finally:
         D

is *exactly* equivalent to::

     try:
         try:
             A
         except:
             B
         else:
             C
     finally:
         D

Resolved all the questions about control flow for me.  Well, assuming 
that the implementation makes the explanation true.  ;)

From m.u.k.2 at gawab.com  Thu May  5 00:51:30 2005
From: m.u.k.2 at gawab.com (M.Utku K.)
Date: Wed, 4 May 2005 22:51:30 +0000 (UTC)
Subject: [Python-Dev] Need to hook Py_FatalError
References: <Xns964B980C943F1token@80.91.229.5>
Message-ID: <Xns964D156E73E25token@80.91.229.5>

Hi,

Added the patch to the patch manager on SF.


Best regards.


From tdelaney at avaya.com  Thu May  5 01:28:01 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Thu, 5 May 2005 09:28:01 +1000
Subject: [Python-Dev] PEP 340 -- concept clarification
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE72127A@au3010avexu1.global.avaya.com>

Nick Coghlan wrote:

> Ah, someone else did post this idea first :)

I knew I was standing on the shoulders of others :)

> To deal with the generator issue, one option would be to follow up on
> Phillip's idea of a decorator to convert a generator (or perhaps any
> standard iterator) into a block iterator.
> 
> I think this would also do wonders for emphasising the difference
> between for loops and block statements.

I think if we are going to emphasise the difference, a decorator does
not go far enough. To use a decorator, this *must* be valid syntax::

    def gen():
        try:
            yield
        finally:
            print 'Done!'

However, that generator cannot be properly used in a for-loop. So it's
only realistically valid with the decorator, and used in a block
statement (resource suite ;)

My feeling is that the above should be a SyntaxError, as it currently
is, and that a new keyword is needed which explicitly allows the above,
and creates an object conforming to the resource protocol (as I called
it).

Tim Delaney

From shane.holloway at ieee.org  Thu May  5 01:29:39 2005
From: shane.holloway at ieee.org (Shane Holloway (IEEE))
Date: Wed, 04 May 2005 17:29:39 -0600
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <42792888.10209@hathawaymix.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>	<b348a0850505041157dfb3659@mail.gmail.com>	<d5b7f0$spf$1@sea.gmane.org>
	<42792888.10209@hathawaymix.org>
Message-ID: <42795AE3.10808@ieee.org>

Shane Hathaway wrote:
> For each block statement, it is necessary to create a *new* iterator,
> since iterators that have stopped are required to stay stopped.  So at a
> minimum, user-defined statements will need to call something, and thus
> will have parentheses.  The parentheses might be enough to make block
> statements not look like built-in keywords.

Definitely true for generators.  Not necessarily true for iterators in 
general::

     class Example(object):
         value = 0
         result = False

         def __iter__(self):
             return self

         def next(self):
             self.result = not self.result
             if self.result:
                 self.value += 1
                 return self.value
             else:
                 raise StopIteration()

::

     >>> e = Example()
     >>> list(e)
     [1]
     >>> list(e)
     [2]
     >>> list(e)
     [3]


It might actually be workable in the transaction scenario, as well as 
others.  I'm not sure if I love or hate the idea though.

Another thing.  In the specification of the Anonymous Block function, is 
there a reason for "itr = EXPR1" instead of "itr = iter(EXPR1)"?  It 
seems to be an asymmetry with the 'for' loop specification.

Thanks,
-Shane (Holloway)  ;)

From tdelaney at avaya.com  Thu May  5 01:37:18 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Thu, 5 May 2005 09:37:18 +1000
Subject: [Python-Dev] PEP 340: Breaking out.
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE025204E5@au3010avexu1.global.avaya.com>

Aahz wrote:

> My standard workaround is using exceptions, but I'm not sure how that
> interacts with a block:
> 
>     try:
>         for name in filenames:
>             with opened(name) as f:
>                 if f.read(2) == 0xFEB0:
>                     raise Found
>     except Found:
>         pass

For any sane block iterator, it will work as expected. However, an evil
block iterator could suppress the `Found` exception.

A sane block iterator should re-raise the original exception.

Tim Delaney

From gvanrossum at gmail.com  Thu May  5 01:56:41 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 4 May 2005 16:56:41 -0700
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <ca471dc20505021755518773c8@mail.gmail.com>
References: <ca471dc20505021755518773c8@mail.gmail.com>
Message-ID: <ca471dc2050504165676d78703@mail.gmail.com>

I'm forced by my day job to temporarily withdraw from the discussion
about PEP 340 (I've used up my Python quota for the next several
weeks).

If agreement is reached in python-dev to suggest specific changes to
the PEP, please let me know via mail sent directly to me and not cc'ed
to python-dev. But please only if there is broad agreement on
something.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tim.peters at gmail.com  Thu May  5 02:29:07 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Wed, 4 May 2005 20:29:07 -0400
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <427950A6.6090408@ieee.org>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
	<1f7befae05050413141e1128dc@mail.gmail.com>
	<17017.12341.491136.991788@montanaro.dyndns.org>
	<427950A6.6090408@ieee.org>
Message-ID: <1f7befae0505041729512ca505@mail.gmail.com>

[Shane Holloway]
> And per the PEP, I think that explaining that::
> 
>     try:
>         A
>     except:
>         B
>     else:
>         C
>     finally:
>         D
>
> is *exactly* equivalent to::
> 
>     try:
>         try:
>             A
>         except:
>             B
>         else:
>             C
>     finally:
>         D
> 
> Resolved all the questions about control flow for me.  Well, assuming
> that the implementation makes the explanation true.  ;)

Yup!  It's not unreasonable to abbreviate it, but the second form is
obvious on the face of it, and can already be written.  I'm neutral on
adding the slightly muddier shortcut.

From edcjones at comcast.net  Thu May  5 05:37:20 2005
From: edcjones at comcast.net (Edward C. Jones)
Date: Wed, 04 May 2005 23:37:20 -0400
Subject: [Python-Dev] Adding DBL_MANTISSA and such to Python
Message-ID: <427994F0.6020504@comcast.net>

Recently I needed some information about the floating point numbers on 
my machine. So I wrote a tiny C99 program with the line

printf("%a\n", DBL_EPSILON);

The answer was "0x1p-52".

A search of comp.lang.python shows that I was not alone. Here are some 
ideas.

1. Add to Python the constants in "float.h" and "limits.h".

2. Add the C99 "%a" format to the "%" operator for strings and allow it 
in floating point literals.

3. Add full "tostring" and "fromstring" capabilities for Python numeric 
types. "tostring(x)" would return a string containing the binary 
representation of x. For example, if x is a Python float, "tostring(x)" 
would have eight characters. "fromstring(s, atype)" does the reverse, so
     fromstring(tostring(x), type(x)) == x

4. Add some functions that process floating point types at a low level. 
I suggest borrowing from C
     (mantissa, exponent) = frexp(x)
where mantissa is a float and exponent is an int. The mantissa can be 
0.0 or 0.5 <= mantissa < 1.0. Also x = mantissa * 2**exponent. If
x == 0.0, the function returns (0.0, 0). (This is almost a quote from 
Harbison and Steele.)

5. Add the C99 constants and functions involving special floating point 
values: "FP_INFINITE", "FP_NAN", "FP_NORMAL", "FP_SUBNORMAL", "FP_ZERO", 
"fpclassify", "isfinite", "isinf", "isnan", "isnormal", "signbit", 
"copysign", "nan", "nextafter", and "nexttoward". There has been 
controversy about these in the past, but I am in favor of them. The 
documentation should discuss portability.
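Some of items 3 and 4 can already be approximated with the stdlib: `struct`
gives the raw eight-byte representation of a float, and `math.frexp` splits
mantissa and exponent exactly as described (a sketch, not the proposed API):

```python
import math
import struct

x = 6.5
raw = struct.pack('>d', x)                   # "tostring": eight bytes
assert len(raw) == 8
assert struct.unpack('>d', raw)[0] == x      # "fromstring" round-trip

mantissa, exponent = math.frexp(x)           # x == mantissa * 2**exponent
assert 0.5 <= mantissa < 1.0
assert mantissa * 2 ** exponent == x
assert math.frexp(0.0) == (0.0, 0)           # the zero special case
```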


From greg.ewing at canterbury.ac.nz  Thu May  5 06:33:03 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 05 May 2005 16:33:03 +1200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <42792888.10209@hathawaymix.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
	<ca471dc2050503132010abb4df@mail.gmail.com>
	<20020107054513.566d74ed@localhost.localdomain>
	<d5b4dr$fa8$1@sea.gmane.org>
	<b348a0850505041157dfb3659@mail.gmail.com> <d5b7f0$spf$1@sea.gmane.org>
	<42792888.10209@hathawaymix.org>
Message-ID: <4279A1FF.9030308@canterbury.ac.nz>

Shane Hathaway wrote:
> For each block statement, it is necessary to create a *new* iterator,
> since iterators that have stopped are required to stay stopped.  So at a
> minimum, user-defined statements will need to call something, and thus
> will have parentheses.

Not necessarily!

   class Frobbing:

     def __neg__(self):
       begin_frobbing()
       try:
         yield
       finally:
        end_frobbing()

   frobbing = Frobbing()

   ...

   -frobbing:
     do_something()

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From shane at hathawaymix.org  Thu May  5 06:36:43 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Wed, 04 May 2005 22:36:43 -0600
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <42795AE3.10808@ieee.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>	<b348a0850505041157dfb3659@mail.gmail.com>	<d5b7f0$spf$1@sea.gmane.org>	<42792888.10209@hathawaymix.org>
	<42795AE3.10808@ieee.org>
Message-ID: <4279A2DB.6040504@hathawaymix.org>

Shane Holloway (IEEE) wrote:
> Another thing.  In the specification of the Anonymous Block function, is 
> there a reason for "itr = EXPR1" instead of "itr = iter(EXPR1)"?  It 
> seems to be an asymmetry with the 'for' loop specification.

Hmm... yeah.  That's strange.  In fact, if it gets changed to "itr =
iter(EXPR1)", as it probably ought to, all of the existing examples will
continue to work.  It will also be safe to start block iterators with a
single variable, nullifying my argument about parentheses.

So Reinhold's examples stand, except for the "try" block, since it
clashes with a keyword.  They read well, but when something goes wrong
in the code, how would a new programmer crack these nuts?

    with:
        (...)

    synchronized:
        (...)

    transaction:
        (...)

> Thanks,
> -Shane (Holloway)  ;)

Once in a while I read a post by Shane H????way and start wondering when
I wrote it and how I could've forgotten about it.  And then I realize I
didn't. :-)

Shane

From greg.ewing at canterbury.ac.nz  Thu May  5 06:38:29 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 05 May 2005 16:38:29 +1200
Subject: [Python-Dev] PEP 340: Only for try/finally?
In-Reply-To: <740c3aec05050314544178f57f@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<740c3aec05050314544178f57f@mail.gmail.com>
Message-ID: <4279A345.8050708@canterbury.ac.nz>

Björn Lindqvist wrote:

> But why stop there? Lots of functions that takes a callable as
> argument could be upgraded to use the new block syntax.

Actually, this is something that occurred to me in
potential support of a thunk implementation: It's
possible that many functions already out there which
take function arguments could *already* be used with
a thunk-based block statement, without needing to be
re-written at all.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From shane at hathawaymix.org  Thu May  5 06:51:17 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Wed, 04 May 2005 22:51:17 -0600
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <4279A1FF.9030308@canterbury.ac.nz>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>	<b348a0850505041157dfb3659@mail.gmail.com>
	<d5b7f0$spf$1@sea.gmane.org>	<42792888.10209@hathawaymix.org>
	<4279A1FF.9030308@canterbury.ac.nz>
Message-ID: <4279A645.9060004@hathawaymix.org>

Greg Ewing wrote:
> Shane Hathaway wrote:
> 
>>For each block statement, it is necessary to create a *new* iterator,
>>since iterators that have stopped are required to stay stopped.  So at a
>>minimum, user-defined statements will need to call something, and thus
>>will have parentheses.
> 
> 
> Not necessarily!
> 
>    class Frobbing:
> 
>      def __neg__(self):
>        begin_frobbing()
>        try:
>          yield
>        finally:
>         end_frobbing()
> 
>    frobbing = Frobbing()
> 
>    ...
> 
>    -frobbing:
>      do_something()

Larry Wall would hire you in a heartbeat. ;-)

Maybe there's really no way to prevent people from writing cute but
obscure block statements.  A keyword like "block" or "suite" would give
the reader something firm to hold on to.

Shane

From greg.ewing at canterbury.ac.nz  Thu May  5 07:12:07 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 05 May 2005 17:12:07 +1200
Subject: [Python-Dev] PEP 340 -- Clayton's keyword?
In-Reply-To: <42792888.10209@hathawaymix.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
	<ca471dc2050503132010abb4df@mail.gmail.com>
	<20020107054513.566d74ed@localhost.localdomain>
	<d5b4dr$fa8$1@sea.gmane.org>
	<b348a0850505041157dfb3659@mail.gmail.com> <d5b7f0$spf$1@sea.gmane.org>
	<42792888.10209@hathawaymix.org>
Message-ID: <4279AB27.3010405@canterbury.ac.nz>

How about user-defined keywords?

Suppose you could write

   statement opening

   def opening(path, mode):
     f = open(path, mode)
     try:
       yield
     finally:
       f.close()

which would then allow

   opening "myfile", "w" as f:
     do_something_with(f)

The 'statement' statement declares to the parser that an
identifier is to be treated as a keyword introducing a
block statement when it appears as the first token in a
statement.

This would allow keywordless block-statements that look
very similar to built-in statements, without any danger of
forgetting to make a function call, since a call would
be implicit in all such block-statements.

A 'statement' declaration would be needed in all modules
which use the generator, e.g.

   statement opening
   from filestuff import opening

For convenience, this could be abbreviated to

   from filestuff import statement opening

There could also be an abbreviation

   def statement opening(...):
     ...

for when you're defining and using it in the same module.

Sufficiently smart editors would understand the 'statement'
declarations and highlight accordingly, making these
user-defined statements look even more like the native
ones.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From jbone at place.org  Thu May  5 07:33:23 2005
From: jbone at place.org (Jeff Bone)
Date: Thu, 5 May 2005 00:33:23 -0500
Subject: [Python-Dev] PEP 340 -- Clayton's keyword?
In-Reply-To: <4279AB27.3010405@canterbury.ac.nz>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
	<ca471dc2050503132010abb4df@mail.gmail.com>
	<20020107054513.566d74ed@localhost.localdomain>
	<d5b4dr$fa8$1@sea.gmane.org>
	<b348a0850505041157dfb3659@mail.gmail.com>
	<d5b7f0$spf$1@sea.gmane.org> <42792888.10209@hathawaymix.org>
	<4279AB27.3010405@canterbury.ac.nz>
Message-ID: <2BA5DA3C-BF3D-4B6B-A53B-EAFDCDF8CAE8@place.org>


+1

This is awesome.

BTW, did we totally abandon the question of using block: as RHS?

jb

On May 5, 2005, at 12:12 AM, Greg Ewing wrote:

> How about user-defined keywords?
>
> Suppose you could write
>
>    statement opening
>
>    def opening(path, mode):
>      f = open(path, mode)
>      try:
>        yield
>      finally:
>        f.close()
>
> which would then allow
>
>    opening "myfile", "w" as f:
>      do_something_with(f)
>
> The 'statement' statement declares to the parser that an
> identifier is to be treated as a keyword introducing a
> block statement when it appears as the first token in a
> statement.
>
> This would allow keywordless block-statements that look
> very similar to built-in statements, without any danger of
> forgetting to make a function call, since a call would
> be implicit in all such block-statements.
>
> A 'statement' declaration would be needed in all modules
> which use the generator, e.g.
>
>    statement opening
>    from filestuff import opening
>
> For convenience, this could be abbreviated to
>
>    from filestuff import statement opening
>
> There could also be an abbreviation
>
>    def statement opening(...):
>      ...
>
> for when you're defining and using it in the same module.
>
> Sufficiently smart editors would understand the 'statement'
> declarations and highlight accordingly, making these
> user-defined statements look even more like the native
> ones.
>


From shane at hathawaymix.org  Thu May  5 08:18:38 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Thu, 05 May 2005 00:18:38 -0600
Subject: [Python-Dev] [OT] Re:  PEP 340 -- Clayton's keyword?
In-Reply-To: <2BA5DA3C-BF3D-4B6B-A53B-EAFDCDF8CAE8@place.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>	<b348a0850505041157dfb3659@mail.gmail.com>	<d5b7f0$spf$1@sea.gmane.org>
	<42792888.10209@hathawaymix.org>	<4279AB27.3010405@canterbury.ac.nz>
	<2BA5DA3C-BF3D-4B6B-A53B-EAFDCDF8CAE8@place.org>
Message-ID: <4279BABE.6040208@hathawaymix.org>

Just a little offtopic note to Jeff Bone: Jeff, every time I send a
message to Python-Dev, your "Mail.app 2.0" sends me a nasty auto-reply
that I can't quote in public.  Please stop.  Since I can't seem to reach
you by email, I'm trying to reach you through this mailing list.  The
note refers to something about "Shantar"; maybe that will help you
figure out what's wrong.

Shane

From martin at v.loewis.de  Thu May  5 08:39:54 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Thu, 05 May 2005 08:39:54 +0200
Subject: [Python-Dev] Adding DBL_MANTISSA and such to Python
In-Reply-To: <427994F0.6020504@comcast.net>
References: <427994F0.6020504@comcast.net>
Message-ID: <4279BFBA.2020608@v.loewis.de>

Edward C. Jones wrote:
> The documentation should discuss portability.

This is the critical issue here. Discussing portability is not
enough; these features really ought to be either available on
a majority of the installations, or not available at all.
In particular, they would need to be available on Windows.
I haven't checked whether VC 7.1 provides them, and if it doesn't,
somebody would have to provide a "direct" implementation.

I'd say "contributions are welcome".

Regards,
Martin

From python-dev at zesty.ca  Thu May  5 08:59:42 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 5 May 2005 01:59:42 -0500 (CDT)
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE025204E5@au3010avexu1.global.avaya.com>
References: <338366A6D2E2CA4C9DAEAE652E12A1DE025204E5@au3010avexu1.global.avaya.com>
Message-ID: <Pine.LNX.4.58.0505050150350.4786@server1.LFW.org>

On Thu, 5 May 2005, Delaney, Timothy C (Timothy) wrote:
> Aahz wrote:
> > My standard workaround is using exceptions, but I'm not sure how that
> > interacts with a block:
> >
> >     try:
> >         for name in filenames:
> >             with opened(name) as f:
> >                 if f.read(2) == 0xFEB0:
> >                     raise Found
> >     except Found:
> >         pass
>
> For any sane block iterator, it will work as expected. However, an evil
> block iterator could suppress the `Found` exception.

I was thinking about more use cases for the block statement,
and one of the ideas was an exception-logging block:

    def logged_exceptions(file):
        try:
            yield
        except Exception, value:
            file.write(repr(value) + '\n')

    block logged_exceptions(file):
        do stuff
        do stuff
        do stuff

...but then i wasn't sure whether this was supposed to be
possible in the proposed scheme.  Currently, generators do not
catch exceptions raised in the code that they yield values to,
because the target of the yield is in a higher stack frame.

This makes sense from a language design perspective, since
there is no try...finally construct lexically wrapping the thing
that raises the exception.  In current Python, for example,
this says 'caught outside generator':

    def spam_generator():
        try:
            yield 'spam'
        except ValueError, value:
            print 'caught inside generator'

    try:
        g = spam_generator()
        i = g.next()
        raise ValueError(5)
    except ValueError, value:
        print 'caught outside generator'

But now i'm confused.  Tim's words above seem to suggest that
the interior generator could actually catch exceptions raised
outside, by whatever is on the other end of the yield.

So, could a block statement really catch that exception inside?
I think it might be too confusing if it were possible.


-- ?!ng

From jcarlson at uci.edu  Thu May  5 09:27:54 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Thu, 05 May 2005 00:27:54 -0700
Subject: [Python-Dev] Adding DBL_MANTISSA and such to Python
In-Reply-To: <427994F0.6020504@comcast.net>
References: <427994F0.6020504@comcast.net>
Message-ID: <20050505000111.64BD.JCARLSON@uci.edu>


"Edward C. Jones" <edcjones at comcast.net> wrote:

> 3. Add full "tostring" and "fromstring" capabilities for Python numeric 
> types. "tostring(x)" would return a string containing the binary 
> representation of x. For example, if x is a Python float, "tostring(x)" 
> would have eight characters. "fromstring(s, atype)" does the reserve, so
>      fromstring(tostring(x), type(x)) == x

For floats:
  struct.pack("d",...)
  struct.unpack("d",...)
For 32-bit signed integers:
  struct.pack("i",...)
  struct.unpack("i",...)
For 64-bit signed integers:
  struct.pack("q",...)
  struct.unpack("q",...)

Heck, you can even get big-endian output on little-endian machines (or
vv.) if you want!  Or you can toss the signs on the integers, get shorts,
or even chars.

Python already has such functionality in the standard library, though
perhaps it isn't the most aptly named (being in 'struct' rather than a
'value_packing' module...though its functionality was for packing and
unpacking of c-style structs...).


Alternatively, you can create an array (using similar typecodes), and
use the .tostring() and .fromstring() mechanism.
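Both spellings satisfy the round-trip property being asked for; a quick
sketch (using the tobytes()/frombytes() names that eventually replaced
tostring()/fromstring() on arrays):

```python
import struct
import array

# Round-trip a float through its 8-byte binary representation.
x = 3.14159
packed = struct.pack("d", x)       # tostring(x): 8 bytes
assert len(packed) == 8
(y,) = struct.unpack("d", packed)  # fromstring(s, float)
assert y == x                      # exact round trip for doubles

# The array module gives the same round trip for whole sequences.
a = array.array("d", [1.5, 2.5, 3.5])
b = array.array("d")
b.frombytes(a.tobytes())
assert list(b) == [1.5, 2.5, 3.5]
```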

> 4. Add some functions that process floating point types at a low level. 
> I suggest borrowing from C
>      (mantissa, exponent) = frexp(x)

What about the sign?  Here's an implementation for you that includes the
sign...

def frexp(f):
    if not isinstance(f, float):
        raise TypeError, "Requires float argument"
    
    v, = struct.unpack(">Q", struct.pack(">d", f))
    #we ignore denormalized values, NANs, and infinities...
    return v>>63, 1 + (v&(2**52-1))/(2.0**52), ((v>>52)&2047)-1023


Is that enough?  Or did you want to convert back into a float?

def inv_frexp(sign, mantissa, exponent):
    #I don't know what this is normally called in C...
    v = bool(sign)*2**63
    v += (abs(int(exponent+1023))&2047)*2**52
    v += abs(int(((mantissa-1)*2**52)))&(2**52-1)
    
    f, = struct.unpack(">d", struct.pack(">Q", v))
    return f

Yeah, there's some bit work in there, and some is merely protection
against foolish inputs, but it's not that bad.
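For comparison, the stdlib's math.frexp uses a different normalization
from the raw-bits version above: it returns (m, e) with 0.5 <= abs(m) < 1
and x == m * 2**e, folding the sign into the mantissa:

```python
import math

# frexp splits a float into mantissa and power-of-two exponent.
m, e = math.frexp(6.0)
assert (m, e) == (0.75, 3)       # 6.0 == 0.75 * 2**3
m, e = math.frexp(-6.0)
assert (m, e) == (-0.75, 3)      # the sign lives in the mantissa
assert math.ldexp(m, e) == -6.0  # ldexp is the inverse operation
```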

 - Josiah


From t-meyer at ihug.co.nz  Thu May  5 09:29:02 2005
From: t-meyer at ihug.co.nz (Tony Meyer)
Date: Thu, 5 May 2005 19:29:02 +1200
Subject: [Python-Dev] python-dev Summary for 2005-04-16 through 2005-04-30
	[draft]
Message-ID: <ECBA357DDED63B4995F5C1F5CBE5B1E801B0F6FB@its-xchg4.massey.ac.nz>

Here's April Part Two.  If anyone can take their eyes off the anonymous block
threads for a moment and give this a once-over, that would be great!  Please
send any corrections or suggestions to Tim (tlesher at gmail.com), Steve
(steven.bethard at gmail.com) and/or me, rather than cluttering the list.
Ta!

======================
Summary Announcements
======================

---------------
Exploding heads
---------------

After a gentle introduction for our first summary, python-dev really let
loose last fortnight; not only with the massive PEP 340 discussion, but
also with more spin-offs than a `popular`_ `TV`_ `series`_, and a few
stand-alone threads.

Nearly a week into May, and the PEP 340 talk shows no sign of abating;
this is unfortunate, since Steve's head may explode if he has to write
anything more about anonymous blocks.  Just as well there are three of us!

.. _popular: http://imdb.com/title/tt0060028/
.. _TV: http://imdb.com/title/tt0098844/
.. _series: http://imdb.com/title/tt0247082/

[TAM]

-------
PEP 340
-------

A request for anonymous blocks by Shannon -jj Behrens launched a
massive discussion about a variety of related ideas. This discussion
is split into different sections for the sake of readability, but
as the sections are extracted from basically the same discussion,
it may be easiest to read them in the following order:

1. `Localized Namespaces`_

2. `The Control Flow Management Problem`_

3. `Block Decorators`_

4. `PEP 310 Updates Requested`_

5. `Sharing Namespaces`_

6. `PEP 340 Proposed`_

[SJB]


=========
Summaries
=========

--------------------
Localized Namespaces
--------------------

Initially, the "anonymous blocks" discussion focused on introducing
statement-local namespaces as a replacement for lambda expressions.
This would have allowed localizing function definitions to a single
namespace, e.g.::

    foo = property(get_foo) where:
         def get_foo(self):
             ...

where get_foo is only accessible within the ``foo = ...`` assignment
statement. However, this proposal seemed mainly to be motivated by a
desire to avoid "namespace pollution", an issue which Guido felt was not
really that much of a problem.
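For reference, the pollution concern can already be worked around, if
clumsily, by deleting the helper from the class namespace after use; a
minimal sketch:

```python
class Widget(object):
    def get_foo(self):
        return self._foo
    foo = property(get_foo)
    del get_foo  # remove the helper from the class namespace

w = Widget()
w._foo = 42
assert w.foo == 42                      # the property still works
assert not hasattr(Widget, 'get_foo')   # the helper name is gone
```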


Contributing threads:

- `anonymous blocks
<http://mail.python.org/pipermail/python-dev/2005-April/052717.html>`__

[SJB]


-----------------------------------
The Control Flow Management Problem
-----------------------------------

Guido suggested that if new syntax were to be introduced for "anonymous
blocks", it should address the more important problem of being able to
extract common patterns of control flow. A canonical example of such
a problem, and thus one of the recurring examples in the thread, is
the acquire/release pattern, e.g.::

    lock.acquire()
    try:
       CODE
    finally:
       lock.release()

Guido was hoping that syntactic sugar and an appropriate definition of
locking() could allow such code to be written as::

    locking(lock):
       CODE

where locking() would factor out the acquire(), try/finally and
release().  For such code to work properly, ``CODE`` would have to
execute in the enclosing namespace, so it could not easily be converted
into a def-statement.
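The namespace problem is easy to demonstrate with a plain thunk-passing
helper (a hypothetical function, not one of the proposals): names
assigned inside the thunk stay local to it and never reach the caller:

```python
import threading

def with_lock(lock, thunk):
    # factor out the acquire/try/finally/release pattern
    lock.acquire()
    try:
        thunk()
    finally:
        lock.release()

lock = threading.Lock()

def body():
    result = 42  # bound in body()'s local namespace only

with_lock(lock, body)
# 'result' is not visible here -- the reason CODE must execute
# in the enclosing namespace for the sugar to be useful.
assert 'result' not in globals()
```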

Some of the suggested solutions to this problem:

- `Block Decorators`_

- `PEP 310 Updates Requested`_

- `Sharing Namespaces`_

- `PEP 340 Proposed`_


Contributing threads:

- `anonymous blocks
<http://mail.python.org/pipermail/python-dev/2005-April/052717.html>`__

[SJB]


----------------
Block Decorators
----------------

One of the first solutions to `The Control Flow Management Problem`_ was
"block decorators".  Block decorators were functions that accepted a
"block object" (also referred to in the thread as a "thunk"), defined a
particular control flow, and inserted calls to the block object at the
appropriate points in the control flow. Block objects would have been
much like function objects, in that they encapsulated a sequence of
statements, except that they would have had no local namespace; names
would have been looked up in their enclosing function.

Block decorators would have wrapped sequences of statements in much the
same way as function decorators wrap functions today. "Block decorators"
would have allowed locking() to be written as::

    def locking(lock):
        def block_deco(block):
            lock.acquire()
            try:
                block()
            finally:
                lock.release()
        return block_deco

and invoked as::

    @locking(lock):
        CODE

The implementation of block objects would have been somewhat
complicated if a block object was a first class object and could be
passed to other functions.  This would have required all variables used
in a block object to be "cells" (which provide slower access than
normal name lookup). Additionally, first class block objects, as a type
of callable, would have confused the meaning of the return statement -
should the return exit the block or the enclosing function?


Contributing threads:

- `anonymous blocks
<http://mail.python.org/pipermail/python-dev/2005-April/052717.html>`__

- `Anonymous blocks: Thunks or iterators?
<http://mail.python.org/pipermail/python-dev/2005-April/053093.html>`__

[SJB]

-------------------------
PEP 310 Updates Requested
-------------------------

Another suggested solution to `The Control Flow Management Problem`_
was the resuscitation of `PEP 310`_, which described a protocol for
invoking the __enter__() and __exit__() methods of an object at the
beginning and ending of a set of statements. This PEP was originally
intended mainly to address the acquisition/release problem, an example
of which is discussed in `The Control Flow Management Problem`_ as the
locking() problem. Unmodified, `PEP 310`_ could handle the locking()
problem by defining locking() as::

    class locking(object):
        def __init__(self, lock):
            self.lock = lock
        def __enter__(self):
            self.lock.acquire()
        def __exit__(self):
            self.lock.release()

and invoking it as::

    with locking(lock):
        CODE

In addition, an extended version of the `PEP 310`_ protocol which
augmented the __enter__() and __exit__() methods with __except__() and
__else__() methods provided a simple syntax for some of the
transactional use cases as well.
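A sketch of how the extended protocol might look for the transactional
case (the method names follow the thread; `FakeDB` and the `run_with`
driver are stand-ins, not proposed APIs):

```python
import sys

class transactional(object):
    # Sketch of the extended PEP 310 protocol discussed in the thread.
    def __init__(self, db):
        self.db = db
    def __enter__(self):
        self.db.begin()
    def __except__(self, exc_type, exc_value, tb):
        self.db.rollback()
    def __else__(self):
        self.db.commit()

def run_with(mgr, body):
    # Tiny driver emulating the proposed statement semantics.
    mgr.__enter__()
    try:
        body()
    except Exception:
        mgr.__except__(*sys.exc_info())
        raise
    else:
        mgr.__else__()

class FakeDB(object):
    def __init__(self):
        self.calls = []
    def begin(self):
        self.calls.append('begin')
    def commit(self):
        self.calls.append('commit')
    def rollback(self):
        self.calls.append('rollback')

db = FakeDB()
run_with(transactional(db), lambda: None)
assert db.calls == ['begin', 'commit']
```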


Contributing threads:

- `PEP 310 and exceptions:
<http://mail.python.org/pipermail/python-dev/2005-April/052857.html>`__

- `__except__ use cases:
<http://mail.python.org/pipermail/python-dev/2005-April/052868.html>`__

- `Integrating PEP 310 with PEP 340
<http://mail.python.org/pipermail/python-dev/2005-April/053039.html>`__


 .. _PEP 310: http://www.python.org/peps/pep-0310.html


[SJB]


------------------
Sharing Namespaces
------------------

Jim Jewett suggested that `The Control Flow Management Problem`_ could
be solved in many cases by allowing arbitrary blocks of code to share
namespaces with other blocks of code. As injecting such arbitrary code
into a template has been traditionally done in other languages with
compile-time macros, this thread briefly discussed some of the reasons
for not wanting compile-time macros in Python, most importantly, that
Python's compiler is "too dumb" to do much at compile time.

The discussion then moved to runtime "macros", which would essentially
inject code into functions at runtime. The goal here was that the
injected code could share a namespace with the function into which it
was injected. Jim Jewett proposed a strawman implementation that would
mark includable chunks of code with a "chunk" keyword, and require
these chunks to be included using an "include" keyword.

The major problems with this approach were that names could "magically"
appear in a function after including a chunk, and that functions that
used an include statement would have dramatically slower name lookup
as Python's lookup optimizations rely on static analysis.

Contributing threads:

- `defmacro (was: Anonymous blocks):
<http://mail.python.org/pipermail/python-dev/2005-April/052854.html>`__

- `anonymous blocks vs scope-collapse:
<http://mail.python.org/pipermail/python-dev/2005-April/052973.html>`__

- `scope-collapse:
<http://mail.python.org/pipermail/python-dev/2005-April/053006.html>`__

- `anonymous blocks as scope-collapse: detailed proposal
<http://mail.python.org/pipermail/python-dev/2005-April/053104.html>`__

[SJB]


----------------
PEP 340 Proposed
----------------

In the end, Guido decided that what he really wanted as a solution to
`The Control Flow Management Problem`_ was the simplicity of something
like generators that would let him write locking() as something like::

    def locking(lock):
        lock.acquire()
        try:
            yield
        finally:
            lock.release()

and invoke it as something like::

    block locking(lock):
        CODE

where the yield statement indicates the return of control flow to
``CODE``. Unlike a generator in a normal for-loop, the generator in
such a "block statement" would need to guarantee that the finally block
was executed by the end of the block statement. For this reason, and
the fact that many felt that overloading for-loops for such non-loops
might be confusing, Guido suggested introducing "block statements" as
a new separate type of statement.

`PEP 340`_ explains the proposed implementation.  Essentially, a
block-statement takes a block-iterator object, and calls its
__next__() method in a while loop until it is exhausted, in much
the same way that a for-statement works now. However, for generators
and well-behaved block-iterators, the block-statement guarantees that
the block-iterator is exhausted, by calling the block-iterator's
__exit__() method. Block-iterator objects can also have values passed
into them using a new extended continue statement. Several people were
especially excited about this prospect as it seemed very useful for a
variety of event-driven programming techniques.
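In rough outline, the proposed expansion behaves like the following
sketch (much simplified, and leaning on the generator close() method
that only materialized later with PEP 342):

```python
def run_block(blk, body):
    # Simplified sketch of `block EXPR: BODY`: drive the block-iterator
    # to exhaustion, handing each yielded value to the body, and make
    # sure cleanup runs even if the loop exits early.
    try:
        for value in blk:
            body(value)
    finally:
        close = getattr(blk, 'close', None)
        if close is not None:
            close()

events = []

def locking(lock):
    events.append('acquire')      # stand-in for lock.acquire()
    try:
        yield
    finally:
        events.append('release')  # stand-in for lock.release()

run_block(locking(None), lambda _: events.append('body'))
assert events == ['acquire', 'body', 'release']
```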

A few issues still remained unresolved at the time this summary was
written:

Should the block-iterator protocol be distinct from the current
iterator protocol? Phillip J. Eby campaigned for this, suggesting that
by treating them as two separate protocols, block-iterators could be
prevented from accidentally being used in for-loops where their cleanup
code would be silently omitted. Having two separate protocols would
also allow objects to implement both protocols if appropriate, e.g.
perhaps sometimes file objects should close themselves, and sometimes
they shouldn't. Guido seemed inclined to instead merge the two
protocols, allowing block-iterators to be used in both for-statements
and block-statements, saying that the benefits of having two very
similar but distinct protocols were too small.

What syntax should block-statements use? This one was still undecided,
with dozens of different keywords having been proposed. Syntaxes that
did not seem to have any major detractors at the time this summary was
written:

* ``EXPR as NAME: BLOCK``

* ``@EXPR as NAME: BLOCK``

* ``in EXPR as NAME: BLOCK``

* ``block EXPR as NAME: BLOCK``

* ``finalize EXPR as NAME: BLOCK``


Contributing threads:

- `anonymous blocks
<http://mail.python.org/pipermail/python-dev/2005-April/052717.html>`__

- `next(arg)
<http://mail.python.org/pipermail/python-dev/2005-April/053156.html>`__

- `PEP 340 - possible new name for block-statement
<http://mail.python.org/pipermail/python-dev/2005-April/053109.html>`__

- `PEP 340: syntax suggestion - try opening(filename) as f:
<http://mail.python.org/pipermail/python-dev/2005-April/053163.html>`__

- `Keyword for block statements
<http://mail.python.org/pipermail/python-dev/2005-April/053184.html>`__

- `About block statement name alternative
<http://mail.python.org/pipermail/python-dev/2005-April/053139.html>`__

- `Integrating PEP 310 with PEP 340
<http://mail.python.org/pipermail/python-dev/2005-April/053039.html>`__


 .. _PEP 340: http://www.python.org/peps/pep-0340.html

[SJB]


------------------
A switch statement
------------------

The switch statement (a la `PEP 275`_) came up again this month, as a
spin-off (wasn't everything?) of the `PEP 340`_ discussion.  The main
benefit of such a switch statement is avoiding Python function calls,
which are very slow compared to branching to inlined Python code. In
addition, repetition (the name of the object being compared, and the
comparison operator) is avoided.  Although Brian Beck contributed a
`switch cookbook recipe`_, the discussion didn't progress far enough to
make any changes or additions to `PEP 275`_.
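The usual pure-Python workaround is dictionary dispatch, which avoids
the repeated comparisons but still pays a function call per branch; a
minimal sketch:

```python
def on_get():
    return 'fetching'

def on_put():
    return 'storing'

def on_other():
    return 'unknown'

# One hashed lookup replaces a chain of elif comparisons.
dispatch = {'GET': on_get, 'PUT': on_put}

def handle(verb):
    return dispatch.get(verb, on_other)()

assert handle('GET') == 'fetching'
assert handle('DELETE') == 'unknown'
```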


Contributing threads:

- `switch statement
<http://mail.python.org/pipermail/python-dev/2005-April/052778.html>`__

 .. _PEP 275: http://www.python.org/peps/pep-0275.html
 .. _switch cookbook recipe:
http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/410692
 
[TAM]


----------------
Pattern Matching
----------------

Discussion about a dictionary-lookup switch statement branched out
to more general discussion about pattern matching statements, like
OCaml's "match", and Haskell's case statement.  There was reasonable
interest in pattern matching, but types are typically a key feature
of pattern matching, and many of the elegant uses of pattern matching
rely on recursion to traverse data structures, both of which hinder
coming up with a viable Python implementation (at least while CPython
lacks tail-recursion elimination).

The exception to this is matching strings, where regular expressions
provide a method of pattern specification, and multi-way branches
based on string content are common.
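A small sketch of the string case, where the re module already supplies
a usable pattern language:

```python
import re

def classify(token):
    # Multi-way branch on string structure -- the one case where
    # Python already has a pattern language to lean on.
    if re.match(r'\d+$', token):
        return 'number'
    if re.match(r'[A-Za-z_]\w*$', token):
        return 'identifier'
    return 'other'

assert classify('42') == 'number'
assert classify('spam_1') == 'identifier'
assert classify('+') == 'other'
```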

As such, it seems unlikely that any proposals for new pattern matching
language features will be made at this time.


Contributing threads:

- `switch statement
<http://mail.python.org/pipermail/python-dev/2005-April/052778.html>`__

[TAM]


-----------------------------------------
Read-only property access inconsistencies
-----------------------------------------

Barry Warsaw noticed that the exception thrown for read-only properties
of C extension types is a TypeError, while for read-only properties of
Python (new-style) classes is an AttributeError, and `wrote a patch`_ to
clean up the inconsistency.
 
Guido pronounced that this was an acceptable fix for 2.5, and so the change
was checked in.  Along the way, he wondered whether in the long-term,
AttributeError should perhaps inherit from TypeError.

          
Contributing threads:

- `Inconsistent exception for read-only properties?
<http://mail.python.org/pipermail/python-dev/2005-April/052681.html>`__

 .. _wrote a patch:
http://sourceforge.net/tracker/index.php?func=detail&aid=1184449&group_id=5470&atid=105470

[TAM]


--------------------
site.py enhancements
--------------------

Bob Ippolito asked for review of a `patch to site.py`_ to solve three
deficiencies for Python 2.5:

- It is no longer true that all site dirs must exist on the file system
- The directories added to sys.path by .pth files are not scanned for
further .pth files
- .pth files cannot use os.path.expanduser()

Greg Ewing suggested additionally scanning the directory containing the
main .py file for .pth files to make it easier to have collections of
Python programs sharing a common set of modules.  Possibly swamped by the
`PEP 340`_ threads, further discussion trailed off, so it's likely that
Bob's patch will be applied, but without Greg's proposal.


Contributing threads:

- `site enhancements (request for review)
<http://mail.python.org/pipermail/python-dev/2005-April/052885.html>`__

 .. _patch to site.py: http://python.org/sf/1174614

[TAM]


-------------------------------------
Passing extra arguments when building
-------------------------------------

Martin v. Löwis helped Brett C out with adding some code to facilitate a
Py_COMPILER_DEBUG build for use on the AST branch.  Specifically, an
EXTRA_CFLAGS environment variable was added to the build process to
enable passing additional compiler arguments that aren't modified by
configure and that change binary compatibility.


Contributing threads:

- `Proper place to put extra args for building
<http://mail.python.org/pipermail/python-dev/2005-April/052746.html>`__

[TAM]


---------------------------
zipfile module improvements
---------------------------

Bob Ippolito pointed out that the "2GB bug" that was `supposed to be fixed`_
was not, and opened a `new bug and patch`_ that should fix the issue
correctly, as well as a bug that sets the ZipInfo record's platform to
Windows, regardless of the actual platform.  He suggested that someone
should consider rewriting the zipfile module to include a repair feature,
be up-to-date with `the latest specifications`_, and include support
for Apple's zip format extensions.  Charles Hartman added deleting a file
from a zipfile to this list.

Bob suggested that one of the most useful additions to the zipfile module
would be a stream interface for reading and writing
(a `patch to read large items`_ along these lines already exists).  Guido
liked this idea and suggested that Bob rework the zipfile module, if
possible, for Python 2.5.


Contributing threads:

- `zipfile still has 2GB boundary bug
<http://mail.python.org/pipermail/python-dev/2005-April/052887.html>`__

.. _supposed to be fixed: http://python.org/sf/679953
.. _new bug and patch: http://python.org/sf/1189216
.. _the latest specifications:
http://www.pkware.com/company/standards/appnote/
.. _patch to read large items: http://python.org/sf/1121142

[TAM]


-------------------------------------
super_getattro() and attribute lookup
-------------------------------------

Phil Thompson asked a number of questions about super_getattro(), and
attribute lookup.  Michael Hudson answered these, and pointed out that
there has been `some talk`_ of having a tp_lookup slot in typeobjects.
However, he noted that he has many other pots on the boil at the moment,
so is unlikely to work on it soon.

 
Contributing threads:

- `super_getattro() Behaviour
<http://mail.python.org/pipermail/python-dev/2005-April/052655.html>`__

.. _some talk:
http://mail.python.org/pipermail/python-dev/2005-March/052150.html

[TAM]


--------------------------------
Which objects are memory cached?
--------------------------------

Facundo Batista asked about memory caching of objects (for performance
reasons), to aid in explaining how to think using name/object and not
variable/value.  In practice, ints between -5 and 100 are cached,
1-character strings are often cached, and string literals that resemble
Python identifiers are often interned.  It was noted that the reference
manual specifies that immutables *may* be cached, but that CPython
specifics, such as which objects are, are omitted so that people will
not think of them as fixed; Terry Reedy reiterated his suggestion that
implementation details such as this be documented separately, elsewhere.

Guido and Greg Ewing pointed out that when explaining this, it is important
for the explainees to understand that mutable objects are never in
danger of being shared.
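The name/object model is easy to demonstrate; a short sketch:

```python
a = [1, 2]
b = a            # two names, one mutable object
b.append(3)
assert a == [1, 2, 3]          # mutation is visible through both names

c = [1, 2, 3]
assert a == c and a is not c   # equal value, distinct objects

# Fresh mutable objects are never silently shared by the
# implementation; aliasing only happens when the program creates it.
x = []
y = []
assert x is not y
```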
 

Contributing threads:

- `Caching objects in memory
<http://mail.python.org/pipermail/python-dev/2005-April/052844.html>`__

[TAM]


------------------------------
Unregistering atexit functions
------------------------------

Nick Jacobson noted that while you can mark functions to be called
at exit with the 'register' function, there's no 'unregister' function to
remove them from the stack of functions to be called.  Many suggestions
were made about how this could already be done, however, including using
try/finally, writing the cleanup routines in such a way that they could
detect reentry, passing a class to register(), and managing a list of
functions oneself.

General pearls of wisdom outlined included:

- If one devotes time to "making a case", then one should also devote equal
effort to researching the hazards and API issues.
- "Potentially useful" is usually trumped by "potentially harmful".
- If the API is awkward or error-prone, that is a bad sign.


Contributing threads:

- `atexit missing an unregister method
<http://mail.python.org/pipermail/python-dev/2005-April/052983.html>`__

[TAM]


-----------------------
Pickling buffer objects
-----------------------

Travis Oliphant proposed a patch to the pickle module to allow
pickling of the buffer object.  His use case was avoiding copying
array data into a string before writing to a file in Numeric, while
still having Numeric arrays interact seamlessly with other
pickleable types.  The proposal was to unpickle the object as a
string (since a `bytes object`_ does not exist) rather than as a
mutable-byte buffer object, to maintain backwards compatibility.
Other than a couple of questions, no objections were raised, so a
patch is likely to appear.


Contributing threads:

- `Pickling buffer objects.
<http://mail.python.org/pipermail/python-dev/2005-April/052707.html>`__

.. _bytes object: http://python.org/peps/pep-0296.html

[TAM]


===============
Skipped Threads
===============

- `Another Anonymous Block Proposal
<http://mail.python.org/pipermail/python-dev/2005-April/053031.html>`__
- `PEP 340: What is "ret" in block statement semantics?
<http://mail.python.org/pipermail/python-dev/2005-April/053140.html>`__
- `anonymous blocks (off topic: match)
<http://mail.python.org/pipermail/python-dev/2005-April/052759.html>`__
- `Reference counting when entering and exiting scopes
<http://mail.python.org/pipermail/python-dev/2005-April/052779.html>`__
- `How do you get yesterday from a time object
<http://mail.python.org/pipermail/python-dev/2005-April/052712.html>`__
- `shadow password module (spwd) is never built due to error in setup.py
<http://mail.python.org/pipermail/python-dev/2005-April/052679.html>`__
- `Problem with embedded python
<http://mail.python.org/pipermail/python-dev/2005-April/052995.html>`__
- `python.org crashing Mozilla?
<http://mail.python.org/pipermail/python-dev/2005-April/052992.html>`__
- `noob question regarding the interpreter
<http://mail.python.org/pipermail/python-dev/2005-April/053103.html>`__
- `Newish test failures
<http://mail.python.org/pipermail/python-dev/2005-April/052771.html>`__
- `Error checking in init<module> functions
<http://mail.python.org/pipermail/python-dev/2005-April/052847.html>`__
- `a few SF bugs which can (probably) be closed
<http://mail.python.org/pipermail/python-dev/2005-April/052864.html>`__
- `Check out a new way to read threaded conversations.
<http://mail.python.org/pipermail/python-dev/2005-April/052674.html>`__
- `Python 2.1 in HP-UX
<http://mail.python.org/pipermail/python-dev/2005-April/052710.html>`__
- `Python tests fails on HP-UX 11.11 and core dumps
<http://mail.python.org/pipermail/python-dev/2005-April/052652.html>`__
- `IPV6 with Python- 4.2.1 on HPUX
<http://mail.python.org/pipermail/python-dev/2005-April/052656.html>`__
- `Fwd: CFP: DLS05: ACM Dynamic Languages Symposium
<http://mail.python.org/pipermail/python-dev/2005-April/052699.html>`__
- `PyCon 2005 keynote on-line
<http://mail.python.org/pipermail/python-dev/2005-April/052684.html>`__
- `PyCallable_Check redeclaration
<http://mail.python.org/pipermail/python-dev/2005-April/052677.html>`__
- `os.urandom uses closed FD (sf 1177468)
<http://mail.python.org/pipermail/python-dev/2005-April/052716.html>`__
- `Removing --with-wctype-functions support
<http://mail.python.org/pipermail/python-dev/2005-April/052968.html>`__


From jcarlson at uci.edu  Thu May  5 09:53:58 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Thu, 05 May 2005 00:53:58 -0700
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <Pine.LNX.4.58.0505041419510.4786@server1.LFW.org>
References: <42791C50.3090107@hathawaymix.org>
	<Pine.LNX.4.58.0505041419510.4786@server1.LFW.org>
Message-ID: <20050505004244.64C0.JCARLSON@uci.edu>


Ka-Ping Yee <python-dev at zesty.ca> wrote:
> 
> On Wed, 4 May 2005, Shane Hathaway wrote:
> >
> >     for name in filenames:
> >         opening(name) as f:
> >             if f.read(2) == 0xFEB0:
> >                 break for
> 
>     continue with 2
> 


There is something about <action> <which> <level> that I just don't like.
I can't really put my finger on it right now, so perhaps it is merely
personal aesthetics.  I'll sleep on it and see if I can come up with a
good reason why I don't like it.

 - Josiah


From jcarlson at uci.edu  Thu May  5 10:10:53 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Thu, 05 May 2005 01:10:53 -0700
Subject: [Python-Dev] Adding DBL_MANTISSA and such to Python
In-Reply-To: <20050505000111.64BD.JCARLSON@uci.edu>
References: <427994F0.6020504@comcast.net>
	<20050505000111.64BD.JCARLSON@uci.edu>
Message-ID: <20050505010955.64C3.JCARLSON@uci.edu>


Josiah Carlson <jcarlson at uci.edu> wrote:

           unsigned
             vvvvvv
> For 64 bit signed integers:
>   struct.pack("Q",...)
>   struct.unpack("Q",...)

My fingers were typing too fast (I do much work with unsigned 64 bit
integers, but not much with signed ones).
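The signed/unsigned distinction being corrected here is easy to see with the struct module ("q" and "Q" are the real 64-bit format codes; "<" requests standard sizes):

```python
import struct

# "q" is signed 64-bit, "Q" is unsigned 64-bit.
assert struct.pack("<q", -1) == b"\xff" * 8   # two's-complement -1
(u,) = struct.unpack("<Q", struct.pack("<q", -1))
assert u == 2 ** 64 - 1                       # same bits, unsigned view
```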

 - Josiah


From mwh at python.net  Thu May  5 10:17:30 2005
From: mwh at python.net (Michael Hudson)
Date: Thu, 05 May 2005 09:17:30 +0100
Subject: [Python-Dev] Adding DBL_MANTISSA and such to Python
In-Reply-To: <427994F0.6020504@comcast.net> (Edward C. Jones's message of
	"Wed, 04 May 2005 23:37:20 -0400")
References: <427994F0.6020504@comcast.net>
Message-ID: <2mzmvaw0h1.fsf@starship.python.net>

"Edward C. Jones" <edcjones at comcast.net> writes:

> Recently I needed some information about the floating point numbers on 
> my machine. So I wrote a tiny C99 program with the line
>
> printf("%a\n", DBL_EPSILON);
>
> The answer was "0x1p-52".
>
> A search of comp.lang.python shows that I was not alone. Here are some 
> ideas.
>
> 1. Add to Python the constants in "float.h" and "limits.h".

Where?

> 2. Add the C99 "%a" format to the "%" operator for strings and allow it 
> in floating point literals.

Is there an implementation of this somewhere?  We mostly certainly are
not demanding a C99 compiler yet.

> 3. Add full "tostring" and "fromstring" capabilities for Python numeric 
> types. "tostring(x)" would return a string containing the binary 
> representation of x. For example, if x is a Python float, "tostring(x)" 
> would have eight characters. "fromstring(s, atype)" does the reverse, so
>      fromstring(tostring(x), type(x)) == x

We have this already in the struct module.  I have a patch that should
improve the robustness of these functions on IEEE-754 platforms in the
face of special values that you can review if you like:

    http://python.org/sf/1181301

(my not-so-recent anguished "will one of you bastards please review
this and/or 1180995 for me?" still applies, btw)
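The struct-module round-trip Michael refers to looks like this (a minimal sketch; "<d" is the standard-size 8-byte IEEE-754 double format):

```python
import struct

x = 3.140625                      # any Python float round-trips exactly
packed = struct.pack("<d", x)     # 8-byte IEEE-754 double
assert len(packed) == 8
(y,) = struct.unpack("<d", packed)
assert y == x                     # fromstring(tostring(x)) == x
```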

> 4. Add some functions that process floating point types at a low level. 
> I suggest borrowing from C
>      (mantissa, exponent) = frexp(x)
> where mantissa is a float and exponent is an int. The mantissa can be 
> 0.0 or 0.5 <= mantissa < 1.0. Also x = mantissa * 2**exponent. If
> x == 0.0, the function returns (0.0, 0). (This is almost a quote from 
> Harbison and Steele.)

>>> math.frexp(math.pi)
(0.78539816339744828, 2)

What am I missing?
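The existing math.frexp already satisfies every property Edward quotes from Harbison and Steele:

```python
import math

m, e = math.frexp(math.pi)
assert 0.5 <= m < 1.0
assert m * 2 ** e == math.pi       # exact: scaling by a power of two
assert math.frexp(0.0) == (0.0, 0)
```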

> 5. Add the C99 constants and functions involving special floating point 
> values: "FP_INFINITE", "FP_NAN", "FP_NORMAL", "FP_SUBNORMAL", "FP_ZERO", 
> "fpclassify", "isfinite", "isinf", "isnan", "isnormal", "signbit", 
> "copysign", "nan", "nextafter", and "nexttoward". There has been 
> controversy about these in the past, but I am in favor of them. The 
> documentation should discuss portability.

If you can supply a patch to make all the compilers out there behave
with respect to these functions, I'll be impressed (they seem to exist
on Mac OS X 10.3, dunno if they work though :).

Cheers,
mwh

-- 
  If you're talking "useful", I'm not your bot.
                                            -- Tim Peters, 08 Nov 2001

From mwh at python.net  Thu May  5 10:10:17 2005
From: mwh at python.net (Michael Hudson)
Date: Thu, 05 May 2005 09:10:17 +0100
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <427950A6.6090408@ieee.org> (Shane Holloway's message of "Wed,
	04 May 2005 16:45:58 -0600")
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
	<1f7befae05050413141e1128dc@mail.gmail.com>
	<17017.12341.491136.991788@montanaro.dyndns.org>
	<427950A6.6090408@ieee.org>
Message-ID: <2m4qdixfdi.fsf@starship.python.net>

"Shane Holloway (IEEE)" <shane.holloway at ieee.org> writes:

> And per the PEP, I think the explaining that::
>
>      try:
>          A
>      except:
>          B
>      else:
>          C
>      finally:
>          D
>
> is *exactly* equivalent to::
>
>      try:
>          try:
>              A
>          except:
>              B
>          else:
>              C
>      finally:
>          D
>
> Resolved all the questions about control flow for me.

Well, yes, that makes sense, but also raises a small "and the point
is...?" flag in my head.

Cheers,
mwh

-- 
  This is the fixed point problem again; since all some implementors
  do is implement the compiler and libraries for compiler writing, the
  language becomes good at writing compilers and not much else!
                                 -- Brian Rogoff, comp.lang.functional

From fredrik at pythonware.com  Thu May  5 10:11:32 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Thu, 5 May 2005 10:11:32 +0200
Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword
References: <1115226841.7909.24.camel@localhost><42791DCB.5050703@hathawaymix.org>
	<1115238075.10836.4.camel@emperor>
Message-ID: <d5cnoa$ds4$1@sea.gmane.org>

Gustavo J. A. M. Carneiro wrote:

>   In that case,
>
>       block VAR1 in EXPR1:
>           BLOCK1
>
> And now I see how using 'for' statements (perhaps slightly changed)
> turned up in the discussion.

you're moving through this discussion exactly backwards; the current
proposal stems from the observation that "for-loop plus generators" in
today's Python does in fact provide a block implementation that solves
many use cases in an elegant way.

PEP 340 builds on this, sorts out a couple of weak points in the current
design, and adds an elegant syntax for most remaining use cases.

</F>




From python at rcn.com  Thu May  5 11:53:12 2005
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 5 May 2005 05:53:12 -0400
Subject: [Python-Dev] my first post: asking about a "decorator" module
In-Reply-To: <4edc17eb0505042347a9d02be@mail.gmail.com>
Message-ID: <002a01c55158$64275b80$11bd2c81@oemcomputer>

> > Ultimately, some of these will likely end-up in the library.  For
the
> > time being, I think it best that these get posted and evolve either
as
> > Wiki entries or as ASPN entries.  The best practices and proven
winners
> > have yet to emerge.  Solidifying first attempts is likely not a good
> > idea.  Putting tools in the standard library should be the last
> > evolutionary step, not the first.
> 
> Yes, of course. I just wanted to know it there was interest on the
> subject.

Yes, there has been quite a bit of interest including several ASPN
recipes and a wiki:

   http://www.python.org/moin/PythonDecoratorLibrary


Raymond

From ncoghlan at gmail.com  Thu May  5 12:44:25 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 05 May 2005 20:44:25 +1000
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <d11dcfba050504083551bb0a1e@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org>
	<427797D5.8030207@cirad.fr>	<17015.39213.522060.873605@montanaro.dyndns.org>	<ca471dc20505031013287a2e92@mail.gmail.com>	<17015.48830.223391.390538@montanaro.dyndns.org>	<ca471dc2050503113127f938b0@mail.gmail.com>	<d58tkb$fvp$1@sea.gmane.org>	<79990c6b050504015762d004ac@mail.gmail.com>	<4278D7D7.2040805@gmail.com>
	<d11dcfba050504083551bb0a1e@mail.gmail.com>
Message-ID: <4279F909.6030206@gmail.com>

Steven Bethard wrote:
> Makes me wonder if we shouldn't just return to the __enter__() and
> __exit__() names of PEP 310[1] where for a generator __enter__() is
> just an alias for next().  We could even require Phillip J. Eby's
> "blockgenerator" decorator to rename next() to __enter__(), and add
> the appropriate __exit__() method.

You must be reading my mind or something. . .

Unless there is something in today's 80-odd messages to make it redundant, look 
for a post entitled something like "Minimalist PEP 340 (aka PEP 310 redux)"

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From ncoghlan at gmail.com  Thu May  5 12:55:14 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 05 May 2005 20:55:14 +1000
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
References: <20050503150510.GA13595@onegeek.org>
	<427797D5.8030207@cirad.fr>	<17015.39213.522060.873605@montanaro.dyndns.org>	<ca471dc20505031013287a2e92@mail.gmail.com>	<17015.48830.223391.390538@montanaro.dyndns.org>	<ca471dc2050503113127f938b0@mail.gmail.com>	<d58tkb$fvp$1@sea.gmane.org>	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
Message-ID: <4279FB92.5050501@gmail.com>

Alex Martelli wrote:
> Looking for a file with a certain magicnumber in its 1st two bytes...?
> 
> for name in filenames:
>      opening(name) as f:
>          if f.read(2) == 0xFEB0: break
> 
> This does seem to make real-life sense to me...

Also consider the vast semantic differences between:

   locking(lock):
       for item in items:
           if can_handle(item): break

   for item in items:
       locking(lock):
           if can_handle(item): break


Instead of simply acquiring and releasing the lock on each iteration as one 
might expect, moving to the latter version *also* causes every item to be 
checked, instead of only items up to the first one that can be handled. The 
break magically becomes meaningless. How does this even come close to executable 
pseudocode?

I also think another factor is that currently, instead of doing try/finally's in 
loops, there is a tendency to push the try/finally into a function, then call 
that function inside the loop. The introduction of block statements means that a 
number of those inner functions are likely to be handled as block statements 
instead - with the above highly confusing result.
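Under the non-looping semantics Nick argues for (no `with` statement existed at the time; this sketch uses the construct Python later adopted in PEP 343), the break in the second version does exit the for loop:

```python
import threading

lock = threading.Lock()
items = [1, 2, 3, 4]
handled = []

for item in items:
    with lock:                  # acquired and released each iteration
        handled.append(item)
        if item >= 2:
            break               # exits the for loop, as one expects

assert handled == [1, 2]        # later items are never checked
```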

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From ncoghlan at gmail.com  Thu May  5 13:00:59 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 05 May 2005 21:00:59 +1000
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <42795AE3.10808@ieee.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>	<b348a0850505041157dfb3659@mail.gmail.com>	<d5b7f0$spf$1@sea.gmane.org>	<42792888.10209@hathawaymix.org>
	<42795AE3.10808@ieee.org>
Message-ID: <4279FCEB.4020207@gmail.com>

Shane Holloway (IEEE) wrote:
> It might actually be workable in the transaction scenario, as well as 
> others.  I'm not sure if I love or hate the idea though.

Given that this is officially a violation of the iterator protocol. . . (check 
the docs for well-behaved iterators)

> Another thing.  In the specification of the Anonymous Block function, is 
> there a reason that "itr = EXPR1" instead of "itr = iter(EXPR1)"?  It 
> seems to be a dis-symmetry with the 'for' loop specification.

Indeed - and a deliberate one, at least partly to discourage caching of block 
iterators.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From ronaldoussoren at mac.com  Thu May  5 13:13:58 2005
From: ronaldoussoren at mac.com (Ronald Oussoren)
Date: Thu, 5 May 2005 13:13:58 +0200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <4279FB92.5050501@gmail.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<4279FB92.5050501@gmail.com>
Message-ID: <FFF043AA-1198-49A9-91E5-76128BC98933@mac.com>


On 5-mei-2005, at 12:55, Nick Coghlan wrote:

> Alex Martelli wrote:
>
>> Looking for a file with a certain magicnumber in its 1st two  
>> bytes...?
>>
>> for name in filenames:
>>      opening(name) as f:
>>          if f.read(2) == 0xFEB0: break
>>
>> This does seem to make real-life sense to me...
>>
>
> Also consider the vast semantic differences between:
>
>    locking(lock):
>        for item in items:
>            if can_handle(item): break
>
>    for item in items:
>        locking(lock):
>            if can_handle(item): break
>
>
> Instead of simply acquiring and releasing the lock on each  
> iteration as one
> might expect, moving to the latter version *also* causes every item  
> to be
> checked, instead of only items up to the first one that can be  
> handled. The
> break magically becomes meaningless. How does this even come close  
> to executable
> pseudocode?

What's bothering me about the proposed semantics is that the block
statement behaves like a loop while most use cases do no looping
whatsoever. Furthermore, it doesn't feel like a loop either. In all
three examples on this page I'd assume that the break would break out
of the for loop.


Ronald

From ncoghlan at gmail.com  Thu May  5 13:46:25 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 05 May 2005 21:46:25 +1000
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE72127A@au3010avexu1.global.avaya.com>
References: <338366A6D2E2CA4C9DAEAE652E12A1DE72127A@au3010avexu1.global.avaya.com>
Message-ID: <427A0791.7000707@gmail.com>

Delaney, Timothy C (Timothy) wrote:
> Nick Coghlan wrote:
> I think if we are going to emphasise the difference, a decorator does
> not go far enough. To use a decorator, this *must* be valid syntax::
> 
>     def gen():
>         try:
>             yield
>         finally:
>             print 'Done!'
> 
> However, that generator cannot be properly used in a for-loop. So it's
> only realistically valid with the decorator, and used in a block
> statement (resource suite ;)
> 
> My feeling is that the above should be a SyntaxError, as it currently
> is, and that a new keyword is needed which explicitly allows the above,
> and creates an object conforming to the resource protocol (as I called
> it).

I think adding __exit__ and __del__ methods to generators will suffice - for a 
normal generator, it *will* get cleaned up eventually.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From ncoghlan at gmail.com  Thu May  5 14:32:59 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 05 May 2005 22:32:59 +1000
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <2m4qdixfdi.fsf@starship.python.net>
References: <d59vll$4qf$1@sea.gmane.org>	<ca471dc205050410097355aa80@mail.gmail.com>	<1f7befae05050411416c198c54@mail.gmail.com>	<d5b7oj$uav$1@sea.gmane.org>	<1f7befae05050413141e1128dc@mail.gmail.com>	<17017.12341.491136.991788@montanaro.dyndns.org>	<427950A6.6090408@ieee.org>
	<2m4qdixfdi.fsf@starship.python.net>
Message-ID: <427A127B.3050901@gmail.com>

Michael Hudson wrote:
> "Shane Holloway (IEEE)" <shane.holloway at ieee.org> writes:
> 
> 
>>And per the PEP, I think the explaining that::
>>
>>     try:
>>         A
>>     except:
>>         B
>>     else:
>>         C
>>     finally:
>>         D
>>
>>is *exactly* equivalent to::
>>
>>     try:
>>         try:
>>             A
>>         except:
>>             B
>>         else:
>>             C
>>     finally:
>>         D
>>
>>Resolved all the questions about control flow for me.
> 
> 
> Well, yes, that makes sense, but also raises a small "and the point
> is...?" flag in my head.

Someone writing a patch and profiling the two versions would serve to convince me :)

Cheers,
Nick.

P.S. Well, assuming the flattened version is faster. . .

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From martin at v.loewis.de  Thu May  5 14:58:02 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Thu, 05 May 2005 14:58:02 +0200
Subject: [Python-Dev] PEP 340 keyword: after
Message-ID: <427A185A.90504@v.loewis.de>

I haven't followed the PEP 340 discussion in detail,
but as the PEP doesn't list keywords that have been
considered and rejected, I'd like to propose my own:
use "after" instead of "block":

after opening("/etc/passwd") as f:
  for line in f:
     print line.rstrip()

after locking(myLock):
  # code that needs to hold the lock

Regards,
Martin

From rodsenra at gpr.com.br  Thu May  5 15:23:39 2005
From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra)
Date: Thu, 5 May 2005 10:23:39 -0300
Subject: [Python-Dev] PEP 340 keyword: after
In-Reply-To: <427A185A.90504@v.loewis.de>
References: <427A185A.90504@v.loewis.de>
Message-ID: <20050505102339.7b745670@localhost.localdomain>

On Thu, 05 May 2005 14:58:02 +0200
"Martin v. L?wis" <martin at v.loewis.de> wrote:

> I haven't followed the PEP 340 discussion in detail,
> but as the PEP doesn't list keywords that have been
> considered and rejected, I'd like to propose my own:
> use "after" instead of "block":
> 
> after opening("/etc/passwd") as f:
>   for line in f:
>      print line.rstrip()
> 
> after locking(myLock):
>   # code that needs to hold the lock
> 

 And *after* fits very nicely for the examples above.
 However, it might get weird for:

    after transaction(db):
        # code in between new_trans / commit_or_abort

 The code pattern that will 'wrap' the block might
 not always make sense with the chosen keyword, if
 that keyword is not semantically neutral.
 (not time-related, not function-related, etc).

 Notice that if _no keyword_ is chosen, nothing prevents us
 from using (even if by aliasing):

   after_opening("/etc/passwd") as f:
       for line in f:
           print line.rstrip()
 
   after_locking(myLock):
       # code that needs to hold the lock

My two cents.
Senra

-- 
Rodrigo Senra                 
--
MSc Computer Engineer    rodsenra(at)gpr.com.br  
GPr Sistemas Ltda        http://www.gpr.com.br/ 
Personal Blog     http://rodsenra.blogspot.com/


From eric.nieuwland at xs4all.nl  Thu May  5 16:36:03 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Thu, 5 May 2005 16:36:03 +0200
Subject: [Python-Dev] PEP 340 -- loose ends
In-Reply-To: <d5b8um$3rq$1@sea.gmane.org>
References: <ca471dc20505021755518773c8@mail.gmail.com>	<20050503201400.GE30548@solar.trillke.net>	<ca471dc2050503132010abb4df@mail.gmail.com>	<20020107054513.566d74ed@localhost.localdomain>	<d5b4dr$fa8$1@sea.gmane.org>	<b348a0850505041157dfb3659@mail.gmail.com>	<d5b7f0$spf$1@sea.gmane.org>
	<42792888.10209@hathawaymix.org> <d5b8um$3rq$1@sea.gmane.org>
Message-ID: <6992f471550fb63f339e0a0dad6ca8a5@xs4all.nl>

Reinhold Birkenfeld wrote:
> Shane Hathaway wrote:
>> PEP 340 seems to punish people for avoiding the parentheses:
>>
>>     transaction = begin_transaction()
>>
>>     transaction:
>>         db.execute('insert 3 into mytable')
>>
>>     transaction:
>>         db.execute('insert 4 into mytable')
>>
>> I expect that only '3' would be inserted in mytable.  The second use 
>> of
>> the transaction iterator will immediately raise StopIteration.
>
> Yes, but wouldn't you think that people would misunderstand it in this 
> way?

This could be solved if the translation of

         block EXPR1 as VAR1:
             BLOCK1
would change from:
         itr = EXPR1  # The iterator
         ret = False  # True if a return statement is active
         ...etc...
to:
         itr = iter(EXPR1)  # The iterator
         ret = False  # True if a return statement is active
         ...etc...
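Shane's prediction is easy to check with a plain generator standing in for the hypothetical block iterator:

```python
def begin_transaction():
    # hypothetical one-shot block iterator, sketched as a generator
    yield "transaction body runs here"

transaction = begin_transaction()
assert next(transaction) is not None   # first block: runs fine
try:
    next(transaction)                  # second block: iterator exhausted
    second_use_ok = True
except StopIteration:
    second_use_ok = False
assert not second_use_ok               # raises StopIteration immediately
```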

--eric

From eric.nieuwland at xs4all.nl  Thu May  5 16:51:46 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Thu, 5 May 2005 16:51:46 +0200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <FFF043AA-1198-49A9-91E5-76128BC98933@mac.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<4279FB92.5050501@gmail.com>
	<FFF043AA-1198-49A9-91E5-76128BC98933@mac.com>
Message-ID: <b121fc93355fcab95dce724ea9796fd8@xs4all.nl>

Ronald Oussoren wrote:
> What's bothering me about the proposed semantics is that block
> statement behaves like a loop while most use cases do no looping 
> whatsoever.
> Furthermore the it doesn't feel like loop either. In all three 
> examples on this page I'd assume
> that the break would break out of the for loop.

I'm bothered the same way.
IMHO control constructs should be very clear: no implicit looping,
conditionals, etc. Especially since the main reason for this whole
discussion is resource management.
The main pattern of use I have in mind is:

resource = grab/allocate/open/whatever(...)
try:
	do something possibly with the resource
except ...:
	...
finally:
	...
	resource.release/deallocate/close/whatever()

This is linear. No looping whatsoever. And easily translated to a 
simple language construct and a protocol:

class resource(object):
	def __init__(self,...):
		# store resource parameters
	def __acquire__(self):
		# whatever it takes to grab the resource
	def __release__(self):
		# free the resource

res = resource(...)
acquire res:
	do something possibly with the resource
except ...:
	...
finally:
	...

The resource is automagically released at the end of the 'acquire' 
block (keyword up for other proposals :-)
An alternative syntax could also be allowed:

acquire resource(...) as res:
	...etc...

Then 'res' would be undefined after the 'acquire' block.
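Eric's acquire/release protocol can be approximated today with a helper function (the `__acquire__`/`__release__` names are his proposal, not part of any Python version; the helper is illustrative):

```python
class Resource(object):
    """Sketch of the proposed resource protocol."""
    def __init__(self):
        self.held = False
    def __acquire__(self):
        self.held = True       # whatever it takes to grab the resource
    def __release__(self):
        self.held = False      # free the resource

def acquire_block(resource, body):
    # function-based stand-in for the proposed 'acquire' statement
    resource.__acquire__()
    try:
        return body(resource)
    finally:
        resource.__release__()

res = Resource()
result = acquire_block(res, lambda r: r.held)
assert result is True      # body ran with the resource held
assert res.held is False   # automagically released afterwards
```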

--eric


From ncoghlan at iinet.net.au  Thu May  5 17:03:54 2005
From: ncoghlan at iinet.net.au (Nick Coghlan)
Date: Fri, 06 May 2005 01:03:54 +1000
Subject: [Python-Dev] PEP 340: Non-looping version (aka PEP 310 redux)
Message-ID: <427A35DA.8050505@iinet.net.au>

The discussion on the meaning of break when nesting a PEP 340 block statement 
inside a for loop has given me some real reasons to prefer PEP 310's single pass 
semantics for user defined statements (more on that at the end). The suggestion 
below is my latest attempt at combining the ideas of the two PEP's.

For the keyword, I've used the abbreviation 'stmt' (for statement). I find it 
reads pretty well, and the fact that it *isn't* a real word makes it easier for 
me to track to the next item on the line to find out the actual statement name 
(I think this might be similar to the effect of 'def' not being a complete word 
making it easier for me to pick out the function name). I consequently use 'user 
statement' or 'user defined statement' to describe what PEP 340 calls anonymous 
block statements.

I'm still fine with the concept of not using a keyword at all, though.

Cheers,
Nick.

== User Defined Statement Usage Syntax ==

   stmt EXPR1 [as VAR1]:
       BLOCK1


== User Defined Statement Semantics ==

   the_stmt = EXPR1
   terminated = False
   try:
       stmt_enter = the_stmt.__enter__
       stmt_exit = the_stmt.__exit__
   except AttributeError:
       raise TypeError("User statement required")
   try:
       VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
   except TerminateBlock:
       pass
       # Block is not entered at all in this case
       # If an else clause were to be permitted, the
       # associated block would be executed here
   else:
       try:
           try:
               BLOCK1
           except:
               exc = sys.exc_info()
               terminated = True
               try:
                   stmt_exit(*exc)
               except TerminateBlock:
                   pass
       finally:
           if not terminated:
               try:
                   stmt_exit(TerminateBlock)
               except TerminateBlock:
                   pass

Key points:
* The supplied expression must have both __enter__ and __exit__ methods.
* The result of the __enter__ method is assigned to VAR1 if VAR1 is given.
* BLOCK1 is not executed if __enter__ raises an exception
* A new exception, TerminateBlock, is used to signal statement completion
* The __exit__ method is called with the exception tuple if an exception occurs
* Otherwise it is called with TerminateBlock as the argument
* The __exit__ method can suppress an exception by converting it to 
TerminateBlock or by returning without reraising the exception
* return, break, continue and raise StopIteration are all OK inside BLOCK1. They 
affect the surrounding scope, and are in no way tampered with by the user 
defined statement machinery (some user defined statements may choose to suppress 
the raising of StopIteration, but the basic machinery doesn't do that)
* Decouples user defined statements from yield expressions, the enhanced 
continue statement and generator finalisation.

== New Builtin: statement ==

   def statement(factory):
       try:
          factory.__enter__
          factory.__exit__
          # Supplied factory is already a user statement factory
          return factory
       except AttributeError:
          # Assume supplied factory is an iterable factory
          # Use it to create a user statement factory
          class stmt_factory(object):
              def __init__(*args, **kwds):
                  self = args[0]
                  self.itr = iter(factory(*args[1:], **kwds))
              def __enter__(self):
                  try:
                      return self.itr.next()
                  except StopIteration:
                      raise TerminateBlock
              def __exit__(self, *exc_info):
                  try:
                      stmt_exit = self.itr.__exit__
                  except AttributeError:
                      try:
                          self.itr.next()
                      except StopIteration:
                          pass
                      raise exc_info[0], exc_info[1], exc_info[2] # i.e. re-raise the supplied exception
                  else:
                      try:
                          stmt_exit(*exc_info)
                      except StopIteration:
                          raise TerminateBlock

Key points:
* The supplied factory is returned unchanged if it supports the statement API 
(such as a class with both __enter__ and __exit__ methods)
* An iterable factory (such as a generator, or class with an __iter__ method) is 
converted to a block statement factory
* Either way, the result is a callable whose results can be used as EXPR1 in a 
user defined statement.
* For statements constructed from iterators, the iterator's next() method is 
called once when entering the statement, and the result is assigned to VAR1
* If the iterator has an __exit__ method, it is invoked when the statement is 
exited. The __exit__ method is passed the exception information (which may 
indicate that no exception occurred).
* If the iterator does not have an __exit__ method, its next() method is 
invoked a second time instead
* When an iterator is used to drive a user defined statement, StopIteration is 
translated to TerminateBlock
* Main intended use is as a generator decorator
* Decouples user defined statements from yield expressions, the enhanced 
continue statement and generator finalisation.
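In modern (Python 3) syntax, the core of that adapter can be sketched and 
exercised as follows. This is a simplified illustration, not the PEP's exact 
semantics: it only handles the no-exception path of __exit__, and the names 
(StmtFactory, tracing) are mine:

```python
class TerminateBlock(Exception):
    """Raised when the underlying iterator is exhausted early."""

def statement(factory):
    # Factories that already speak the enter/exit protocol pass through.
    if hasattr(factory, '__enter__') and hasattr(factory, '__exit__'):
        return factory

    class StmtFactory(object):
        def __init__(self, *args, **kwds):
            self.itr = iter(factory(*args, **kwds))
        def __enter__(self):
            try:
                # Run the generator up to its first yield
                return next(self.itr)
            except StopIteration:
                raise TerminateBlock
        def __exit__(self, exc_type, value, traceback):
            if exc_type is None:
                try:
                    next(self.itr)      # resume the code after the yield
                except StopIteration:
                    pass
            return False                # never swallow exceptions here
    return StmtFactory

@statement
def tracing(log):
    log.append('enter')
    yield 'resource'
    log.append('exit')

log = []
stmt = tracing(log)
value = stmt.__enter__()    # -> 'resource', log == ['enter']
stmt.__exit__(None, None, None)
```

Because StmtFactory instances expose __enter__ and __exit__, they also work 
directly in a modern 'with' block.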

== Justification for non-looping semantics ==

For most use cases, the effect PEP 340 block statements have on break and 
continue statements is both surprising and undesirable. This is highlighted by 
the major semantic difference between the following two cases:

   stmt locking(lock):
       for item in items:
           if handle(item):
               break

   for item in items:
       stmt locking(lock):
           if handle(item):
               break

Instead of simply acquiring and releasing the lock on each iteration, as one 
would legitimately expect, the latter piece of code actually processes all of 
the items, rather than breaking out of the loop once one of the items is handled. 
With non-looping user defined statements, the above code works in the obvious 
fashion (the break statement ends the for loop, not the lock acquisition).
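The behaviour argued for here is exactly what the eventual 'with' statement 
(PEP 343) provides; a small runnable illustration, using a threading.Lock as 
the managed resource:

```python
import threading

lock = threading.Lock()
events = []

def handle(item):
    events.append(item)
    return item == 2      # succeed on the second item

for item in [1, 2, 3]:
    with lock:            # acquired and released on every iteration
        if handle(item):
            break         # ends the for loop, not the lock acquisition

# Items 1 and 2 were processed; the break ended the for loop before 3,
# and the lock was still released on the way out.
```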

With non-looping semantics, the implementation of the examples in PEP 340 is 
essentially identical - just add an invocation of @statement to the start of the 
generators. It also becomes significantly easier to write user defined 
statements manually as there is no need to track state:

   class locking:
       def __init__(self, lock):
           self.lock = lock
       def __enter__(self):
           self.lock.acquire()
       def __exit__(self, exc_type, value=None, traceback=None):
           self.lock.release()
            if exc_type is not None:
               raise exc_type, value, traceback

The one identified use case for a user-defined loop was PJE's auto_retry. We 
already have user-defined loops in the form of custom iterators, and there is 
nothing stopping an iterator from returning user defined statements like this:

   for attempt in auto_retry(3, IOError):
       stmt attempt:
           # Do something!
           # Including break to give up early
           # Or continue to try again without raising IOError

The implementation of auto_retry is messier than it is when all user defined 
statements are loops, but I think the benefits of non-looping semantics justify 
that sacrifice. Besides, it really isn't all that bad:

   class auto_retry(object):
       def __init__(self, times, exc=Exception):
           self.times = xrange(times-1)
           self.exc = exc
           self.succeeded = False

       def __iter__(self):
           attempt = self.attempt
           for i in self.times:
               yield attempt()
               if self.succeeded:
                   break
           else:
               yield self.last_attempt()

       @statement
       def attempt(self):
           try:
               yield None
               self.succeeded = True
           except self.exc:
               pass

       @statement
       def last_attempt(self):
           yield None

(Third time lucky! One day I'll remember that Python has these things called 
classes designed to elegantly share state between a collection of related 
functions and generators. . .)

The above code for auto_retry assumes that generators supply an __exit__ method 
as described in PEP 340 - without that, auto_retry.attempt would need to be 
written as a class since it needs to know if an exception was thrown or not:

   class auto_retry(object):
       def __init__(self, times, exc=Exception):
           self.times = xrange(times-1)
           self.exc = exc
           self.succeeded = False

       def __iter__(self):
           attempt = self.attempt
           for i in self.times:
               yield attempt(self)
               if self.succeeded:
                   break
           else:
               yield self.last_attempt()

       class attempt(object):
           def __init__(self, outer):
               self.outer = outer
           def __enter__(self):
               pass
           def __exit__(self, exc_type, value=None, traceback=None):
               if exc_type is None:
                    self.outer.succeeded = True
                elif not issubclass(exc_type, self.outer.exc):
                   raise exc_type, value, traceback

       @statement
       def last_attempt(self):
           yield None
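For the curious, the whole pattern can be sketched in modern syntax as a plain 
iterator whose items are enter/exit objects. The names below (_Attempt, 
succeeded) are illustrative, not anything from the PEP:

```python
class _Attempt(object):
    """A single retry attempt, usable as a context manager."""
    def __init__(self, outer, last):
        self.outer, self.last = outer, last
    def __enter__(self):
        return self
    def __exit__(self, exc_type, value, traceback):
        if exc_type is None:
            self.outer.succeeded = True
            return False
        # Swallow the retried exception type, except on the final attempt
        return (not self.last) and issubclass(exc_type, self.outer.exc)

class auto_retry(object):
    def __init__(self, times, exc=Exception):
        self.times, self.exc, self.succeeded = times, exc, False
    def __iter__(self):
        for i in range(self.times):
            yield _Attempt(self, last=(i == self.times - 1))
            if self.succeeded:
                break

calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise IOError("transient failure")
    return "ok"

result = None
for attempt in auto_retry(3, IOError):
    with attempt:
        result = flaky()
# flaky fails twice, succeeds on the third try; result == "ok"
```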

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From steven.bethard at gmail.com  Thu May  5 17:05:44 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Thu, 5 May 2005 09:05:44 -0600
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <4279F909.6030206@gmail.com>
References: <20050503150510.GA13595@onegeek.org>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<4278D7D7.2040805@gmail.com>
	<d11dcfba050504083551bb0a1e@mail.gmail.com>
	<4279F909.6030206@gmail.com>
Message-ID: <d11dcfba050505080585fa710@mail.gmail.com>

On 5/5/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Steven Bethard wrote:
> > Makes me wonder if we shouldn't just return to the __enter__() and
> > __exit__() names of PEP 310[1] where for a generator __enter__() is
> > just an alias for next().  We could even require Phillip J. Eby's
> > "blockgenerator" decorator to rename next() to __enter__(), and add
> > the appropriate __exit__() method.
> 
> You must be reading my mind or something. . .
> 
> Unless there is something in today's 80-odd messages to make it redundant, look
> for a post entitled something like "Minimalist PEP 340 (aka PEP 310 redux)"

Yeah, I should have linked to that discussion [1].  I wonder if it
would be possible to update PEP 310 with your ideas, or perhaps start
a new PEP?  I'd like to see a competitor for PEP 340 that addresses
some of the issues that came up, e.g. that the block-statement doesn't
look like a loop, so break and continue might look like they break out
of an enclosing loop.  It might also be a good place to mirror Guido's
PEP 340 examples with PEP 310-style examples -- I know the first
attempts at writing some of them weren't as clean as the later
attempts, so it would be nice to have somewhere to look for the
"current version" of everything.

STeVe

[1]http://mail.python.org/pipermail/python-dev/2005-April/053039.html
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From eric.nieuwland at xs4all.nl  Thu May  5 17:07:21 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Thu, 5 May 2005 17:07:21 +0200
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <d59vll$4qf$1@sea.gmane.org>
References: <d59vll$4qf$1@sea.gmane.org>
Message-ID: <9768a4652f7ffa1778560b3548609827@xs4all.nl>

Reinhold Birkenfeld wrote:
> Changes to the grammar
>
>     The grammar for the try statement, which is currently
>
>         try_stmt: ('try' ':' suite (except_clause ':' suite)+
>                    ['else' ':' suite] | 'try' ':' suite 'finally' ':' 
> suite)
>
>     would have to become
>
>         try_stmt: ('try' ':' suite (except_clause ':' suite)+
>                    ['else' ':' suite] ['finally' ':' suite] |
>                    'try' ':' suite (except_clause ':' suite)*
>                    ['else' ':' suite] 'finally' ':' suite)

Wouldn't it be easier to change it to:

         try_stmt: ('try' ':' suite (except_clause ':' suite)*
                    ['else' ':' suite] ['finally' ':' suite] )
?
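For reference, this unified form is the one Python eventually adopted (PEP 341, 
in Python 2.5); its semantics are equivalent to a try/finally wrapped around a 
try/except/else, as a quick behavioural check shows:

```python
events = []
try:
    events.append('body')
    raise ValueError("boom")
except ValueError:
    events.append('except')   # runs: the body raised ValueError
else:
    events.append('else')     # skipped: an exception occurred
finally:
    events.append('finally')  # always runs, exception or not
```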

--eric


From cpr at emsoftware.com  Thu May  5 16:55:18 2005
From: cpr at emsoftware.com (Chris Ryland)
Date: Thu, 5 May 2005 14:55:18 +0000 (UTC)
Subject: [Python-Dev] PEP 340 keyword: after
References: <427A185A.90504@v.loewis.de>
	<20050505102339.7b745670@localhost.localdomain>
Message-ID: <loom.20050505T164909-961@post.gmane.org>

Rodrigo Dias Arruda Senra <rodsenra <at> gpr.com.br> writes:

>  The code pattern that will 'wrap' the block might
>  not always make sense with the chosen keyword, if
>  that keyword is not semantically neutral.
>  (not time-related, not function-related, etc).
> 
>  Notice that if _no keyword_ is chosen, nothing prevents us
>  from using (even if by aliasing):
> 
>    after_opening("/etc/passwd") as f:
>        for line in f:
>            print line.rstrip()
> 
>    after_locking(myLock):
>        # code that needs to hold the lock

I hate to add to what could be an endless discussion, but... ;-)

In this case, "while" is the better time-related prefix, whether
keyword (hopeless, due to ages-old boolean-controlled loop association)
or function, since you want to imply that the code block is going
on *while* the lock is held or *while* the file is open (and you also
want to imply that afterwards, something else happens, i.e., cleanup).

while_locked(myLock):
    # code that needs to hold the lock

--Chris Ryland, Em Software



From ncoghlan at gmail.com  Thu May  5 17:19:13 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 06 May 2005 01:19:13 +1000
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <9768a4652f7ffa1778560b3548609827@xs4all.nl>
References: <d59vll$4qf$1@sea.gmane.org>
	<9768a4652f7ffa1778560b3548609827@xs4all.nl>
Message-ID: <427A3971.8030400@gmail.com>

Eric Nieuwland wrote:
> Wouldn't it be easier to change it to:
> 
>          try_stmt: ('try' ':' suite (except_clause ':' suite)*
>                     ['else' ':' suite] ['finally' ':' suite] )
> ?

What does a try statement with neither an except clause nor a finally clause mean?

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From mwh at python.net  Thu May  5 17:25:03 2005
From: mwh at python.net (Michael Hudson)
Date: Thu, 05 May 2005 16:25:03 +0100
Subject: [Python-Dev] PEP 340 keyword: after
In-Reply-To: <loom.20050505T164909-961@post.gmane.org> (Chris Ryland's
	message of "Thu, 5 May 2005 14:55:18 +0000 (UTC)")
References: <427A185A.90504@v.loewis.de>
	<20050505102339.7b745670@localhost.localdomain>
	<loom.20050505T164909-961@post.gmane.org>
Message-ID: <2mvf5xwv8w.fsf@starship.python.net>

Chris Ryland <cpr at emsoftware.com> writes:

> In this case, "while" is the better time-related prefix, whether

Indeed.

while_execution_is_lexically_in_the_next_block lock(theLock):
   ...

Anyone?  <wink>.

Cheers,
mwh

-- 
  Every day I send overnight packages filled with rabid weasels to
  people who use frames for no good reason.
                             -- The Usenet Oracle, Oracularity #1017-1

From p.f.moore at gmail.com  Thu May  5 18:01:28 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 5 May 2005 17:01:28 +0100
Subject: [Python-Dev] PEP 340 - Remaining issues
Message-ID: <79990c6b0505050901fe38af1@mail.gmail.com>

On 5/5/05, Steven Bethard <steven.bethard at gmail.com> wrote:
> I wonder if it would be possible to update PEP 310 with your ideas,
> or perhaps start a new PEP?  I'd like to see a competitor for PEP 340 that
> addresses some of the issues that came up, e.g. that the block-statement
> doesn't look like a loop, so break and continue might look like they break
> out of an enclosing loop.

In an attempt to bring things back under control, can I summarise what
I believe are the outstanding issues?

1. Choice (or not) of a keyword. I honestly believe that there will
never be a consensus on this, and we'd be better deferring the
decision to Guido's judgement.

2. Separate protocol or not? I'm not entirely sure I have a view on
this, but it feels related to the looping question below. I do like
being able to write these things as generators, and I don't mind
needing a decorator (although I, personally, don't feel a compelling
*need* for one).

3. Looping blocks, and the break issue. I see a consensus forming here
that blocks should *not* loop. No-one has come up with a strong case
for looping blocks, except the auto_retry example, and Nick (I think
it was Nick Coghlan, sorry if my memory is wrong) demonstrated how to
build this case from a for loop and a non-looping block.

Given that Guido has stated that he is willing to accept a consensus
decision on changes to the PEP, can I suggest that rather than writing
a competitor, someone (who understands the technicalities better than
me) simply propose a modification to PEP 340 that does not loop[1].

I think the separate protocol issue is subtler - maybe it's just a
case of renaming some methods and specifying a decorator, but I really
don't understand the issues at this depth.

I apologise if this post (1) misrepresents anyone's view, or (2)
hinders things rather than helping. But I feel that we are pretty
close to a solution here, and I fear that more competing PEPs will
simply muddy the waters.

Paul.

[1] My simplistic view is that you may be able to get away with
changing the specification of the anonymous block statement's expansion
just to remove the "while True". There's some fixup needed to avoid
the one "break" in the expansion, and probably a lot of little details
that make this far harder than I'm assuming - but maybe that's the
starting point...

From ncoghlan at gmail.com  Thu May  5 18:04:50 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 06 May 2005 02:04:50 +1000
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <d11dcfba050505080585fa710@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org>	<17015.39213.522060.873605@montanaro.dyndns.org>	<ca471dc20505031013287a2e92@mail.gmail.com>	<17015.48830.223391.390538@montanaro.dyndns.org>	<ca471dc2050503113127f938b0@mail.gmail.com>	<d58tkb$fvp$1@sea.gmane.org>	<79990c6b050504015762d004ac@mail.gmail.com>	<4278D7D7.2040805@gmail.com>	<d11dcfba050504083551bb0a1e@mail.gmail.com>	<4279F909.6030206@gmail.com>
	<d11dcfba050505080585fa710@mail.gmail.com>
Message-ID: <427A4422.4@gmail.com>

Steven Bethard wrote:
> I wonder if it
> would be possible to update PEP 310 with your ideas, or perhaps start
> a new PEP?  I'd like to see a competitor for PEP 340 that addresses
> some of the issues that came up, e.g. that the block-statement doesn't
> look like a loop, so break and continue might look like they break out
> of an enclosing loop.  It might also be a good place to mirror Guido's
> PEP 340 examples with PEP 310-style examples -- I know the first
> attempts at writing some of them weren't as clean as the later
> attempts, so it would be nice to have somewhere to look for the
> "current version" of everything.

Well, Michael Hudson and Paul Moore are the current authors of PEP 310, so 
updating it with any of my ideas would be their call.

Either way, my latest and greatest version of the non-looping block statement 
semantics can be found here:
http://mail.python.org/pipermail/python-dev/2005-May/053400.html

Some key advantages of that proposal are:
   1. It's not a loop, so nesting it inside another loop 'just works'
   2. Manual protocol implementations are _significantly_ easier to write
   3. try/finally can be done with generators _without_ changing generators
   4. try/except/else can be done with generators if they provide an __exit__ 
method that raises the exception at the point of the last yield
   5. Clearly distinct construct, no potential for confusion with for loops
   6. Generators must be clearly marked as creating a user defined statement 
(although this could be changed by giving them an __enter__ method and an 
__exit__ method)

The one downside relative to PEP 340 is that looping constructs like auto_retry 
are slightly harder to write, albeit not hugely so (once you remember that an 
iterable can be a class instead of a generator!). On the usage front, I find the 
'loop over an iterator returning user defined statements' does a much better job 
of making the iteration clear, so I'd be inclined to count that as an advantage 
of a PEP 310 style approach.

Anyway, I've already been spending more time on this than I should (sleep is 
optional, right?), so I won't be prettying it up into PEP format any time soon. 
I have no objection to someone else rolling some of the ideas into a PEP, though :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From cgarciaf at lucent.com  Thu May  5 18:05:38 2005
From: cgarciaf at lucent.com (Carlos Garcia)
Date: Thu, 5 May 2005 18:05:38 +0200
Subject: [Python-Dev] problems with memory management
Message-ID: <007501c5518c$477d9150$ba565887@1068801y07c0j>

Hi All,


    I have a problem with Python: it raises an OutOfMemoryError (I commented out
some lines in Py.java to avoid the System.exit call, so I could debug). I tried
to debug the issue with JProbe and found that I get the exception even though the
Java heap is not at its limit; I allocated 64-256M and the Java heap was less
than 60M.

    The program is a command line tool that receives a line, which Python parses
before calling some Java classes to execute the appropriate command. Any ideas?

Thanks,

    
==========================================================
 Carlos Garc?a                    Phone  : +34 91 714 8796
 Lucent Technologies          e-mail : cgarciaf at lucent.com
 Avenida de Bruselas , 8   -  28108 Alcobendas (Madrid)
==========================================================

From rrr at ronadam.com  Thu May  5 18:27:20 2005
From: rrr at ronadam.com (Ron Adam)
Date: Thu, 05 May 2005 12:27:20 -0400
Subject: [Python-Dev] PEP 340 keyword: Extended while syntax
In-Reply-To: <427A185A.90504@v.loewis.de>
References: <427A185A.90504@v.loewis.de>
Message-ID: <427A4968.5080308@ronadam.com>


I expect there's an obvious reason why this hasn't been suggested 
already that I'm not currently thinking of, but here it is anyway.  :-)


How about an *extended while* syntax as a block keyword alternative?

Reasoning: The block statement resembles a "while" block in some ways in 
that it is a conditional block that may be executed only once, or 
possibly not at all (or many times).  And the word "while" is also 
descriptive of how a block is used.

     while VAR1 from EXPR1():
         BLOCK

This will require a new keyword/operator 'from' to use in a 'from' 
expression:

    VAR1 from EXPR1()

Where EXPR1 returns an anonymous iterator, and the expression (VAR1 from 
EXPR1()) evaluates as True only if a value from the EXPR1 iterator is 
received.  Or possibly False if it is received and is None. [* see below]

The "from" expression tests for the name binding instead of testing the value of 
VAR1; it may also be desirable to check VAR1 for None after it is received.

This would be translated as follows:

     1 --> while VAR1 from EXPR1():
	  raise an error if EXPR1 is not an iterator.

     2 --> while (VAR1 = _EXPR1_iter.__next__()):   # internal

     3 --> while True:     # if VAR1 gets a new value
     or 3 -> while False:  # if VAR1 fails to get a value
     [*]or 3 -> while False:  # if VAR1 receives None

* Undecided on check for None. An iterator could always return 
something, so testing for None would be needed; or it could refuse and 
break the request somehow after it is called. In the latter case None 
could be a valid return value, so it may not be desirable to finalize the 
block. A while *might* be able to test for both.

     while VAR1 from EXPR1() and VAR1!=None:

or ...

     while VAR1 from EXPR1() and VAR1:

Order of placement could make a difference.

     while VAR1 and VAR1 from EXPR1():

This would test the *last* VAR1 before getting a new value. That might 
be useful in some situations. This may also be inconsistent with how 
expressions are currently evaluated.  I'm not sure if it's allowed for 
names to be rebound while evaluating an expression.


Examples:

     while lock from locking(myLock):
         # Code here executes with myLock held.

     while f from opening("/etc/passwd"):
         for line in f:
             print line.rstrip()

     while retry from auto_retry(3, IOError):
         f = urllib.urlopen("http://python.org/peps/pep-0340.html")
         print f.read()

     while f from locking_opening(myLock, "/etc/passwd"):
         for line in f:
             print line.rstrip()

     while f from opening(filename, "w"):
         while re_out from redirecting_stdout(f):
             print "Hello world"

     while f, err from opening_w_error("/etc/passwd", "a"):
         if err:
             print "IOError:", err
         else:
             f.write("guido::0:0::/:/bin/sh\n")


Because the *from expression* evaluates to a bool, it might be useful in 
other places, although there may be reason to prevent it from being used 
as such.

     if VAR1 from GEN:
         print VAR1
     else:
         print "GEN didn't give me anything"

Another possibility is the use of xrange() in a block statements/ or 
extended while statements.

     while VAR1 from xrange(100):
         block

This may blur the distinction between "for" loops and "while" loops, 
although it may be a *good* thing, since "for" would then always use 
sequences, and the *extended while syntax* would always use iterators.  Which 
to use would be up to the programmer.

With that change xrange() support could be removed from "for" statements 
in Python 3000, (I think Guido wants to do that.), and it then could be 
used with "extended while" statements.

With this suggestion there will still only be two looping constructs, 
"for" and "while", and I think the distinction between a normal "while" 
and an extended "while" is made clear with the "from" keyword.  I think 
this would be much easier to understand, IMO, and also much easier to 
read and teach as well.  It uses already familiar syntax and adds a new 
expression keyword instead of a new statement keyword.

A symbol might be possible instead of "from", so adding new keywords 
could be avoided if "from" is out of the question.


Optimistically,
Ron_Adam



From ncoghlan at gmail.com  Thu May  5 18:33:45 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 06 May 2005 02:33:45 +1000
Subject: [Python-Dev] PEP 340 - Remaining issues
In-Reply-To: <79990c6b0505050901fe38af1@mail.gmail.com>
References: <79990c6b0505050901fe38af1@mail.gmail.com>
Message-ID: <427A4AE9.80007@gmail.com>

Paul Moore wrote:
> 1. Choice (or not) of a keyword. I honestly believe that there will
> never be a consensus on this, and we'd be better deferring the
> decision to Guido's judgement.

The keyword-less approach is less confusing when the block statement is not a 
loop, as that eliminates the surprising behaviour of break and continue statements.

If there is a keyword, the wide variety of user-defined statements means that 
any real English word will be a bad fit for at least some of them. Something 
relatively nonsensical, but usefully mnemonic (like 'stmt') may be a good way to go.

> 2. Separate protocol or not? I'm not entirely sure I have a view on
> this, but it feels related to the looping question below. I do like
> being able to write these things as generators, and I don't mind
> needing a decorator (although I, personally, don't feel a compelling
> *need* for one).

If the block statement doesn't loop, the PEP 310 protocol makes a lot more 
sense. A function (usable as a generator decorator) can then be provided to 
convert from a callable that returns iterables to a callable that returns 
objects that support the PEP 310 protocol.

> Given that Guido has stated that he is willing to accept a consensus
> decision on changes to the PEP, can I suggest that rather than writing
> a competitor, someone (who understands the technicalities better than
> me) simply propose a modification to PEP 340 that does not loop

My attempt at doing exactly that is "PEP 340: Non-looping version (aka PEP 310 
redux)" [1]

And the seemingly simple change ('run the block at most once') had far more 
wide-ranging ramifications than I expected.

> I think the separate protocol issue is subtler - maybe it's just a
> case of renaming some methods and specifying a decorator, but I really
> don't understand the issues at this depth.

When I was writing my suggested semantics for a non-looping version, the use of 
an iteration protocol (next, StopIteration) became obviously inappropriate. So 
while having a separate protocol is a little murky when block statements are 
loops, the PEP 310 interface protocol is a clear winner when block statements 
are _not_ loops.

> I apologise if this post (1) misrepresents anyone's view, or (2)
> hinders things rather than helping. But I feel that we are pretty
> close to a solution here, and I fear that more competing PEPs will
> simply muddy the waters.

In this case, I think having a separate document (perhaps PEP 310, or maybe a 
Wiki page) to describe how a non-looping block statement can support all of the 
identified use cases for the PEP 340's block statement will be clearer than 
trying to describe the two main alternatives in the same PEP.

Cheers,
Nick.

[1] http://mail.python.org/pipermail/python-dev/2005-May/053400.html

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.skystorm.net

From rrr at ronadam.com  Thu May  5 18:48:29 2005
From: rrr at ronadam.com (Ron Adam)
Date: Thu, 05 May 2005 12:48:29 -0400
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <b121fc93355fcab95dce724ea9796fd8@xs4all.nl>
References: <20050503150510.GA13595@onegeek.org>
	<427797D5.8030207@cirad.fr>	<17015.39213.522060.873605@montanaro.dyndns.org>	<ca471dc20505031013287a2e92@mail.gmail.com>	<17015.48830.223391.390538@montanaro.dyndns.org>	<ca471dc2050503113127f938b0@mail.gmail.com>	<d58tkb$fvp$1@sea.gmane.org>	<79990c6b050504015762d004ac@mail.gmail.com>	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>	<4279FB92.5050501@gmail.com>	<FFF043AA-1198-49A9-91E5-76128BC98933@mac.com>
	<b121fc93355fcab95dce724ea9796fd8@xs4all.nl>
Message-ID: <427A4E5D.1080904@ronadam.com>

Eric Nieuwland wrote:

> This is linear. No looping whatsoever. And easily translated to a 
> simple language construct and a protocol:
> 
> class resource(object):
> 	def __init__(self,...):
> 		# store resource parameters
> 	def __acquire__(self):
> 		# whatever it takes to grab the resource
> 	def __release__(self):
> 		# free the resource


I wanted to see what the examples in PEP340 would look like written with 
standard class's using object inheritance and overriding to define
resource managers.  If anyone's interested I can post it.

My block class is non-looping, as I found that in most cases looping isn't 
required, and looping complicates things because you have to pass around 
a loop expression, since not all loops will want to behave the same way.

The solution was to put the loop in the body method and call a 
repeat_body method (the repeated body section) which is added to the 
class when needed.

Ron_Adam



From tjreedy at udel.edu  Thu May  5 19:17:16 2005
From: tjreedy at udel.edu (Terry Reedy)
Date: Thu, 5 May 2005 13:17:16 -0400
Subject: [Python-Dev] problems with memory management
References: <007501c5518c$477d9150$ba565887@1068801y07c0j>
Message-ID: <d5dk2a$l99$1@sea.gmane.org>


>    I do hava a problem with python and it is that it raise an outofmemory 
>  >(i comment lines in Py.java to avoid system.exit, to debug),


Questions about using current Python belong on the Python list or 
comp.lang.python.  Python-dev is for discussions about developing future 
versions. 




From steven.bethard at gmail.com  Thu May  5 19:23:20 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Thu, 5 May 2005 11:23:20 -0600
Subject: [Python-Dev] PEP 340: Non-looping version (aka PEP 310 redux)
In-Reply-To: <427A35DA.8050505@iinet.net.au>
References: <427A35DA.8050505@iinet.net.au>
Message-ID: <d11dcfba050505102326b6b7ee@mail.gmail.com>

On 5/5/05, Nick Coghlan <ncoghlan at iinet.net.au> wrote:
> The discussion on the meaning of break when nesting a PEP 340 block statement
> inside a for loop has given me some real reasons to prefer PEP 310's single pass
> semantics for user defined statements (more on that at the end). The suggestion
> below is my latest attempt at combining the ideas of the two PEP's.
>
[snip]
> * An iterable factory (such as a generator, or class with an __iter__ method) is
> converted to a block statement factory

I like the non-looping proposal a lot, but I'd still prefer that
iterators were not usable as statements.  As I understand it, the main
motivation for wanting iterators to be usable as statements is that
generators provide a very simple way of creating iterators, and we'd
like to have an equally simple way of creating statements.  The
simplicity of generators is that using a "yield" statement inside a
"def" statement magically modifies the function so that it returns
"iterator" objects.  I'd like to see a parallel for block-statements,
so that using an "XXX" statement inside a "def" statement magically
modifies the function so that it returns "statement" objects.

To illustrate my point, I'm going to assume a no-keyword syntax for
calling statement objects and I'm going to steal your "stmt" keyword
to replace "yield". So, for example, to create the "opening" statement
from PEP 340, you would write it almost exactly the same:

    def opening(filename, mode="r"):
        f = open(filename, mode)
        try:
            stmt f
        finally:
            f.close()

This would create a generator-like object that instead of providing
__iter__() and next() methods, provides __enter__() and __exit__()
methods.  It could then be called like:

    opening("temp.txt") as f:
        for line in f:
            print line

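This is essentially the shape that later shipped as contextlib.contextmanager 
under PEP 343, with 'with' playing the role of the keyword-less call; a 
runnable sketch:

```python
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def opening(filename, mode="r"):
    f = open(filename, mode)
    try:
        yield f          # plays the role of the proposed 'stmt f'
    finally:
        f.close()        # guaranteed cleanup when the block is left

# Set up a throwaway file to mirror the "temp.txt" example
path = os.path.join(tempfile.mkdtemp(), "temp.txt")
with open(path, "w") as out:
    out.write("hello\n")

# Usage mirrors the proposal's 'opening("temp.txt") as f:' block
with opening(path) as f:
    lines = [line.rstrip() for line in f]
```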
I like this for a few reasons:
* statement-generators (or whatever you want to call them) are just as
easy to declare as normal generators are
* try/finally statements around a "yield" would still be invalid
syntax, as generators can't generally guarantee proper finalization
semantics.
* statement objects can't be accidentally used in for-loops; they
don't have __iter__() or next() methods
* statement objects can be clearly documented separately from iterator
objects; there would be no need for them to refer to each other

I don't know the generator implementation very well, but I would think
that statement-generators could share almost all the code of normal
generators by simply changing the name of a slot or two and adding the
__exit__ code already proposed by PEP 340.

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From gustavo at niemeyer.net  Thu May  5 19:36:52 2005
From: gustavo at niemeyer.net (Gustavo Niemeyer)
Date: Thu, 5 May 2005 14:36:52 -0300
Subject: [Python-Dev] PEP 340 keyword: Extended while syntax
In-Reply-To: <427A4968.5080308@ronadam.com>
References: <427A185A.90504@v.loewis.de> <427A4968.5080308@ronadam.com>
Message-ID: <20050505173652.GA6947@burma.localdomain>

Greetings,

> Reasoning: The block statement resembles a "while" block in some ways in 
> that it is a conditional block that may be executed only once, or 
> possibly not at all (or many times).  And the word "while" is also 
> descriptive of how a block is used.
> 
>      while VAR1 from EXPR1():
>          BLOCK

This is an interesting proposal, but for a different PEP.  In the
current proposal VAR1 is not evaluated for truth, and many of
the usage examples don't even require it.

This looks quite strange, for instance:

   while dummy from locking(myLock):
      # Do something

Also, this one would necessarily require a break:

   while (foo, bar) from locking():
      # Pass

> This will require a new keyword/operator 'from' to use in a 'from' 
> expression:

'from' is already a keyword, btw.

-- 
Gustavo Niemeyer
http://niemeyer.net

From jcarlson at uci.edu  Thu May  5 20:08:35 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Thu, 05 May 2005 11:08:35 -0700
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <Pine.LNX.4.58.0505050429200.4786@server1.LFW.org>
References: <20050505004244.64C0.JCARLSON@uci.edu>
	<Pine.LNX.4.58.0505050429200.4786@server1.LFW.org>
Message-ID: <20050505095935.64C6.JCARLSON@uci.edu>


Ka-Ping Yee <yahoo at zesty.ca> wrote:
> 
> On Thu, 5 May 2005, Josiah Carlson wrote:
> > Ka-Ping Yee <python-dev at zesty.ca> wrote:
> > >     continue with 2
> >
> > There is something about <action> <which> <level> that I just don't like.
> 
> Just to clarify: if by <level> you mean "nesting level", did it appear
> that the 2 in my example was a count of block levels?  I didn't
> mean it that way -- the 2 is a value passed in to the generator
> that appears as the value of the "yield" expression, as in PEP 340.

I remember reading that, but I seem to have forgotten it when I was
composing my reply.  Thankfully, sleeping on it has helped me discover
what I really don't like.  With the 'passing value' semantic, the
<action> [ <which> ] [ <value> ] is only useful for the deepest loop of
a particular type. Take for example...

for ...:
    for ...:
        for ...:
            break/continue [for]


That break or continue can only affect the innermost for loop.  It
doesn't make nested fors, nested whiles, or even nested blocks any
easier to use.  It only really helps if you mix and match all possible
looping constructs, and even then it only gives the granularity of the
most recent block of a particular type.  In that sense, I think it is a
nonstarter, because it doesn't really add functionality in common uses
of for and while statements.

If one allowed <action> [<which>] [<value>] , [<level>], then one could
jump to arbitrary loops.  Now, I'm not condoning this, and I don't even
like it.  Sure, it allows breaking or continuing to any for, while, or
block statement in the current scope, but the <level> argument is as
equivalently ambiguous as package-relative imports using a leading
integer (http://python.org/peps/pep-0328.html).

Now, one can remove ambiguity if we were able to 'label' while loops and
for loops producing <action> [ <label> ] , [ <value> ], but at that
point we are getting into the discussion of a loop-aware goto with
loop/block cleanup, and a syntax for labeling loops.  Ick.
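For reference, the usual pure-Python workaround for a multi-level break today is an exception used as a hand-made label; a sketch (the BreakOuter name is invented for illustration):

```python
class BreakOuter(Exception):
    """Hypothetical 'label': raised to escape two loops at once."""

hit = None
try:
    for i in range(3):
        for j in range(3):
            if i * j == 2:
                raise BreakOuter  # acts like a labeled 'break' of both loops
except BreakOuter:
    hit = (i, j)
print(hit)  # (1, 2)
```

It works, but it is exactly the kind of loop-aware goto described above, minus the cleanup guarantees.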

 - Josiah


From rrr at ronadam.com  Thu May  5 21:05:28 2005
From: rrr at ronadam.com (Ron Adam)
Date: Thu, 05 May 2005 15:05:28 -0400
Subject: [Python-Dev] PEP 340 keyword: Extended while syntax
In-Reply-To: <20050505173652.GA6947@burma.localdomain>
References: <427A185A.90504@v.loewis.de> <427A4968.5080308@ronadam.com>
	<20050505173652.GA6947@burma.localdomain>
Message-ID: <427A6E78.5000809@ronadam.com>

Gustavo Niemeyer wrote:
> Greetings,
> 
> 
>>Reasoning: The block statement resembles a "while" block in some ways in 
>>that it is a conditional block that may be executed only once, or 
>>possibly not at all (or many times).  And the word "while" is also 
>>descriptive of how a block is used.
>>
>>     while VAR1 from EXPR1():
>>         BLOCK
> 
> 
> This is an interesting proposal, but for a different PEP.  

Maybe someone who is more familiar with submitting a PEP could submit it 
as a competing PEP to 340 then.

> In the
> current proposal VAR1 is not evaluated for truth, and many of
> the usage examples don't even require it.

VAR1 isn't evaluated for truth, but the expression as a whole is.
It just says: EXPR1 is an iterator, and VAR1 received a value from it.
Evaluating to a bool makes it consistent with 'while' statement usage,
and those checks are already taking place in the block statement.  Here
they are explicit instead of implicit, which adds to the readability.
IMHO, of course.

> This looks quite strange, for instance:
> 
>    while dummy from locking(myLock):
>       # Do something

I thought of that, but I could get used to it.  The dummy helps make it 
readable, although the value may not actually be used in the block.  One 
use with locks is to return a count of the current locks.  Useful for 
monitoring what the iterator is doing.

A shorter "while locking(myLock):" could be used, with the "dummy from" 
part made optional.  In that case the returned None would be discarded.

	while [NAME from] ITERATOR():

Or it could be made explicit with:

	while None == locking(myLock):

Although I suppose this would also look strange to some.  In this case, 
an explicit test is being made of the returned VAR1.  Testing for other 
values would also be possible.


> And also, this would require a break necessarily:
> 
>    while (foo, bar) from locking():
>       # Pass

If the iterator is written without a loop in it, it will only execute 
one yield, so the second time through it will end without returning a 
value.

def locking():
    try:
        yield lock()
    finally:
        release_lock()

This will execute once in an extended while.

def locking():
    try:
        while True:
           yield new_lock()
    finally:
        release_all_locks()

This would need to be broken out of.
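The difference can be seen by driving the iterator by hand, which is roughly what the proposed `while VAR from EXPR():` would desugar to; a sketch with the single-yield form (FakeLock and the event list are stand-ins for illustration, in modern syntax):

```python
events = []

class FakeLock:
    def acquire(self):
        events.append("acquire")
        return self
    def release(self):
        events.append("release")

def locking(lock):
    # Single-yield form: the body runs once, release is guaranteed.
    try:
        yield lock.acquire()
    finally:
        lock.release()

# Rough desugaring of:  while var from locking(mylock): <body>
it = iter(locking(FakeLock()))
while True:
    try:
        var = next(it)
    except StopIteration:
        break
    events.append("body")

print(events)  # ['acquire', 'body', 'release']
```

With the looping (while True) form of the generator, the second next() would yield again instead of raising StopIteration, so only a break in the body could end it.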

>>This will require a new keyword/operator 'from' to use in a 'from' 
>>expression:
> 
> 
> 'from' is already a keyword, btw.

Oh, um... need more sleep. ;-)

So no new keywords would be needed in this example, just an alternate 
use of an existing keyword.  Replace the above with...

 >>This will require *no* new keyword; the existing keyword/operator 
'from' will have a new use in an extended while expression:

Since I tend to put imports at the top of my programs and never see 
'from' anywhere else, it didn't ring a bell.

Any reason why they both couldn't work?


Cheers, Ron_Adam



From python-dev at zesty.ca  Thu May  5 21:13:31 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 5 May 2005 14:13:31 -0500 (CDT)
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <427A4422.4@gmail.com>
References: <20050503150510.GA13595@onegeek.org>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<4278D7D7.2040805@gmail.com>
	<d11dcfba050504083551bb0a1e@mail.gmail.com>
	<4279F909.6030206@gmail.com>
	<d11dcfba050505080585fa710@mail.gmail.com> <427A4422.4@gmail.com>
Message-ID: <Pine.LNX.4.58.0505051411080.4786@server1.LFW.org>

On Fri, 6 May 2005, Nick Coghlan wrote:
> Either way, my latest and greatest version of the non-looping block statement
> semantics can be found here:
> http://mail.python.org/pipermail/python-dev/2005-May/053400.html

My mind is not made up, but i did find this proposal pretty appealing.
I'd love to see it described in a complete PEP as that would make it
easier to compare directly -- in particular, comparing a list of use
cases to the one in PEP 340 would be very valuable to this discussion.


-- ?!ng

From nbastin at opnet.com  Thu May  5 21:51:47 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Thu, 5 May 2005 15:51:47 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427946B9.6070500@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>
	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>
	<427946B9.6070500@v.loewis.de>
Message-ID: <ed6dac3aa136985e60159713ff1d75ac@opnet.com>


On May 4, 2005, at 6:03 PM, Martin v. Löwis wrote:

> Nicholas Bastin wrote:
>> "This type represents the storage type which is used by Python
>> internally as the basis for holding Unicode ordinals.  Extension 
>> module
>> developers should make no assumptions about the size of this type on
>> any given platform."
>
> But people want to know "Is Python's Unicode 16-bit or 32-bit?"
> So the documentation should explicitly say "it depends".

The important piece of information is that it is not guaranteed to be a 
particular one of those sizes.  Once you can't guarantee the size, no 
one really cares what size it is.  The documentation should discourage 
developers from attempting to manipulate Py_UNICODE directly, which, 
other than trivia, is the only reason why someone would care what size 
the internal representation is.
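From the Python side, at least, the build's configuration can be detected at runtime rather than assumed; a small sketch (note that sys.maxunicode is 0xFFFF on narrow builds and 0x10FFFF on wide ones):

```python
import sys

# A narrow build caps code points at the BMP (0xFFFF); a wide build
# covers the full Unicode range (0x10FFFF).
kind = "narrow" if sys.maxunicode == 0xFFFF else "wide"
print(kind, hex(sys.maxunicode))
```

Extension code, of course, still shouldn't bake either answer in at compile time.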

--
Nick

From p.f.moore at gmail.com  Thu May  5 21:58:12 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 5 May 2005 20:58:12 +0100
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <427A4422.4@gmail.com>
References: <20050503150510.GA13595@onegeek.org>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<4278D7D7.2040805@gmail.com>
	<d11dcfba050504083551bb0a1e@mail.gmail.com>
	<4279F909.6030206@gmail.com>
	<d11dcfba050505080585fa710@mail.gmail.com> <427A4422.4@gmail.com>
Message-ID: <79990c6b05050512582e52a7af@mail.gmail.com>

On 5/5/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Well, Michael Hudson and Paul Moore are the current authors of PEP 310, so
> updating it with any of my ideas would be their call.

I'm willing to consider an update - I don't know Michael's view. I
currently find myself in the odd situation of defending PEP 340
against PEP 310, though. I'll try to rationalise why below...

> Either way, my latest and greatest version of the non-looping block statement
> semantics can be found here:
> http://mail.python.org/pipermail/python-dev/2005-May/053400.html

I can't reconcile this description with that of PEP 310. The basic
reason is that the styles are very different, compounded by the fact
that I don't have the time to give this the analysis it needs.

My instinct is that if your proposal can't be described in terms of a
relatively small change to either PEP 310 or PEP 340, then it's a
third proposal in its own right, and I'd rather not open things up
that much again.

> Some key advantages of that proposal are:
>    1. It's not a loop, so nesting it inside another loop 'just works'

This, to me, is the main issue.

>    2. Manual protocol implementations are _significantly_ easier to write

Hmm, I've not tried so I'll have to take your word for this. But I
don't imagine writing manual implementations much - one of the key
features I like about Guido's proposal is that generators can be used,
and the implementation is a clear template, with "yield" acting as a
"put the block here" marker (yes, I know that's an
oversimplification!).

>    3. try/finally can be done with generators _without_ changing generators

How is this an advantage? PEP 340 allows try/finally inside generators
- if that counts as "changing generators" I don't see why I care (as a
user). But I think I'm missing something here - I never really
followed (or cared about) the generator finalisation issues.

>    4. try/except/else can be done with generators if they provide an __exit__
> method that raises the exception at the point of the last yield

As above - I'm not sure I follow.

>    5. Clearly distinct construct, no potential for confusion with for loops

OK, but that's really just saying "not a loop" again.

>    6. Generators must be clearly marked as creating a user defined statement
> (although this could be changed by giving them an __enter__ method and an
> __exit__ method)

I still don't see a compelling argument that this is a good thing. I'm
neutral on this.

> The one downside relative to PEP 340 is that looping constructs like auto_retry
> are slightly harder to write, albeit not hugely so (once you remember that an
> iterable can be a class instead of a generator!).

Are you *sure* that's the only downside? Early on in the PEP 340
discussion, generator finalisation was a point of discussion. I didn't
follow the details, but I believe that one of the points of PEP 340
was that it addressed the issues (to some extent - I really don't know
if the generator finalisation PEPs are rendered obsolete by PEP 340,
are completely orthogonal, or somewhere in between). I have no feel
for whether your proposal covers these issues in the same way as PEP
340 does.

And does your proposal allow for "continue EXPR" as supported by PEP
340? I can't see that it could, given that your proposal treats block
statements as not being loops. Having just noticed this, I start to
feel less convinced that block-as-loop is ultimately wrong. There
aren't any examples of continue EXPR included in PEP 340 yet - a fact
that Guido has acknowledged in item 9 of the examples section. Maybe
Philip or one of the other coroutine fans would like to contribute
some examples?

> On the usage front, I find the
> 'loop over an iterator returning user defined statements' does a much better job
> of making the iteration clear, so I'd be inclined to count that as an advantage
> of a PEP 310 style approach.

I can accept that. It's a minor style point either way. The wording of
it - as "returning user defined statements" makes me nervous though,
as it implies that "user-defined statements" are first class objects -
which feels wrong.

> Anyway, I've already been spending more time on this than I should (sleep is
> optional, right?), so I won't be prettying it up into PEP format any time soon.
> I have no objection to someone else rolling some of the ideas into a PEP, though :)

I think *someone* has to care enough (and have the time) to make a
proper PEP out of this. If it's a minor change to either of PEP 310 or
PEP 340, that's OK. If it's too big for that, it needs to be fleshed
out as a full competing PEP in its own right.

Oh - and the promised rationalisation of my preference for PEP 340
over PEP 310. Things I like about PEP 340:

  - Using generators as "templates" with yield as a "put the block
here" placeholder. It may be that a modification of PEP 310 can
provide this, but that PEP doesn't exist yet (sorry!)
  - The coroutine style "continue EXPR" feature (I still need
motivating examples but as a concept I like it).

[It's interesting that these are the two points Guido notes as
"orthogonal" concepts in PEP 340 - I don't apologise for this though,
it's the whole package that appeals to me].

The looping behaviour is a (fairly nasty) wart, but I'm not sure I
would insist on removing it at the cost of damaging other features I
like. And I wonder whether the desired behaviour of "break" might be
achievable by special-casing it (but I don't have the time or
understanding to flesh that out, so I have to accept that it may not
happen).

Paul.

From nbastin at opnet.com  Thu May  5 21:58:17 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Thu, 5 May 2005 15:58:17 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <42794ABD.2080405@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>
	<427946B9.6070500@v.loewis.de> <42794ABD.2080405@hathawaymix.org>
Message-ID: <94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>


On May 4, 2005, at 6:20 PM, Shane Hathaway wrote:

> Martin v. Löwis wrote:
>> Nicholas Bastin wrote:
>>
>>> "This type represents the storage type which is used by Python
>>> internally as the basis for holding Unicode ordinals.  Extension 
>>> module
>>> developers should make no assumptions about the size of this type on
>>> any given platform."
>>
>>
>> But people want to know "Is Python's Unicode 16-bit or 32-bit?"
>> So the documentation should explicitly say "it depends".
>
> On a related note, it would help if the documentation provided a
> little more background on Unicode encodings.  Specifically, that UCS-2
> is not the same as UTF-16, even though they're both two bytes wide and
> most of the characters are the same.  UTF-16 can encode 4-byte
> characters, while UCS-2 can't.  A Py_UNICODE is either UCS-2 or UCS-4.
> It took me

I'm not sure the Python documentation is the place to teach someone 
about Unicode.  ISO 10646 pretty clearly defines UCS-2 as only 
containing characters in the BMP (plane zero).  On the other hand, I 
don't know why Python lets you choose UCS-2 anyhow, since it's almost 
always not what you want.

--
Nick


From steven.bethard at gmail.com  Thu May  5 23:34:38 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Thu, 5 May 2005 15:34:38 -0600
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <79990c6b05050512582e52a7af@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<4278D7D7.2040805@gmail.com>
	<d11dcfba050504083551bb0a1e@mail.gmail.com>
	<4279F909.6030206@gmail.com>
	<d11dcfba050505080585fa710@mail.gmail.com> <427A4422.4@gmail.com>
	<79990c6b05050512582e52a7af@mail.gmail.com>
Message-ID: <d11dcfba05050514344a8f4517@mail.gmail.com>

On 5/5/05, Paul Moore <p.f.moore at gmail.com> wrote:
> And does your proposal allow for "continue EXPR" as supported by PEP
> 340? I can't see that it could, given that your proposal treats block
> statements as not being loops.

Read PEP 340 again -- the "continue EXPR" syntax is orthogonal to the
discussion -- PEP 340 adds it for *all* for loops, so for loops with
the non-looping block statements would also be able to use it.

> The looping behaviour is a (fairly nasty) wart, but I'm not sure I
> would insist on removing it at the cost of damaging other features I
> like.

I don't think it "damages" any features.  Are there features you still
think the non-looping proposal removes?  (I'm not counting orthogonal
features like "continue EXPR" which could easily be added as an
entirely separate PEP.)

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From shane at hathawaymix.org  Fri May  6 01:55:23 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Thu, 05 May 2005 17:55:23 -0600
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>
	<427946B9.6070500@v.loewis.de> <42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
Message-ID: <427AB26B.2040004@hathawaymix.org>

Nicholas Bastin wrote:
> 
> On May 4, 2005, at 6:20 PM, Shane Hathaway wrote:
>> On a related note, it would help if the documentation provided a
>> little more background on Unicode encodings.  Specifically, that UCS-2 is
>> not the same as UTF-16, even though they're both two bytes wide and most
>> of the characters are the same.  UTF-16 can encode 4-byte characters,
>> while UCS-2 can't.  A Py_UNICODE is either UCS-2 or UCS-4.  It took me
> 
> 
> I'm not sure the Python documentation is the place to teach someone
> about Unicode.  ISO 10646 pretty clearly defines UCS-2 as only
> containing characters in the BMP (plane zero).  On the other hand, I
> don't know why Python lets you choose UCS-2 anyhow, since it's almost
> always not what you want.

Then something in the Python docs ought to say why UCS-2 is not what you
want.  I still don't know; I've heard differing opinions on the subject.
Some say you'll never need more than what UCS-2 provides.  Is that
incorrect?
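For what it's worth, a concrete counterexample: any character outside the BMP needs a surrogate pair in UTF-16, which plain UCS-2 simply cannot represent (a modern-Python sketch):

```python
clef = "\U0001D11E"  # MUSICAL SYMBOL G CLEF, U+1D11E, outside the BMP
# Count 16-bit code units in the UTF-16 encoding of this one character.
units = len(clef.encode("utf-16-be")) // 2
print(units)  # 2 -> a surrogate pair; UCS-2 has no way to spell it
```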

More generally, how should a non-unicode-expert writing Python extension
code find out the minimum they need to know about unicode to use the
Python unicode API?  The API reference [1] ought to at least have a list
of background links.  I had to hunt everywhere.

.. [1] http://docs.python.org/api/unicodeObjects.html

Shane

From rrr at ronadam.com  Fri May  6 02:12:36 2005
From: rrr at ronadam.com (Ron Adam)
Date: Thu, 05 May 2005 20:12:36 -0400
Subject: [Python-Dev]  PEP 340: Examples as class's.
Message-ID: <427AB674.5010809@ronadam.com>


Eric Nieuwland wrote:

 > Ron Adam wrote:
 >
 >> Eric Nieuwland wrote:
 >>
 >>> This is linear. No looping whatsoever. And easily translated to a
 >>> simple language construct and a protocol:
 >>>
 >>> class resource(object):
 >>>     def __init__(self,...):
 >>>         # store resource parameters
 >>>     def __acquire__(self):
 >>>         # whatever it takes to grab the resource
 >>>     def __release__(self):
 >>>         # free the resource
 >>
 >>
 >>
 >>
 >> I wanted to see what the examples in PEP340 would look like written with
 >> standard classes using object inheritance and overriding to define
 >> resource managers.  If anyone's interested I can post it.
 >
 >
 > I'm interested.
 >
 >> My block class is non-looping, as I found that in most cases looping
 >> isn't required, and looping complicates things because you have to pass
 >> around a loop expression, since not all loops will want to behave the
 >> same way.
 >>
 >> The solution was to put the loop in the body method and call a
 >> repeat_body method (the repeated body section) which is added to the
 >> class when needed.
 >
 >
 > Show us your stuff! ;-)
 >
 > --eric
 >
Ok,  :)

These probably can be improved on, and renamed, and I don't know how 
many problems there may be with it, but they work in these examples. 
This is just one way to do it,  so critique, correct, and improve as 
needed.  Just no flames please.  ;-)

Cheers,  Ron_Adam


# --- start ---
## blockclass.py


##  A base class that manages resource acquire and release.
class Block(object):
    """A manager class for working with resources."""
    def __init__(self, *args):
        self._cleanup = False
        self.__err__ = None
        self.start(*args)
        self._cleanup = True
    def __call__(self, *args):
        self.block(*args)
        self.__del__()
    def __del__(self):
        if self._cleanup:
            self.final()
            self._cleanup = False
            if self.__err__:
                raise self.__err__
    def start(self,*args):
        """Override to add initialization code"""
        pass
    def final(self):
        """Override to add finalization code"""
        pass
    def block(self,*args):
        """Override to add main block body"""
        pass


## A dummy lock for test calls only
class mylock:
    def acquire(self):
        print "Lock acquired"
    def release(self):
        print "Lock released"


## 1. A template for ensuring that a lock, acquired at the start of a
##       block, is released when the block is left:
class Get_Lock(Block):
    def start(self, lock):
        self.lock = lock
        self.lock.acquire()
    def final(self):
        self.lock.release()
    def block(self):
        pass

class Lock_It(Get_Lock):
    def block(self):
        print "Do stuff while locked"

Lock_It(mylock())()


## 2. A template for opening a file that ensures the file is closed:
class File_Opener(Block):
    def start(self, filename, mode='r'):
        self.filename = filename
        self.mode = mode
        self.f = open(filename, mode)
    def final(self):
        self.f.close()
    def block(self):
        pass

class Line_Printer(File_Opener):
    def block(self):
        n = 0
        for line in self.f:
            print n, line.rstrip()
            n += 1

Line_Printer('blockclass.py')()



### 3. A template for committing or rolling back a database:
#def transactional(db):
#    try:
#        yield
#    except:
#        db.rollback()
#        raise
#    else:
#        db.commit()

"""I'm not exactly sure how this one should work, so maybe
someone else can give an example using the block class above."""
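One possible reading, as a standalone sketch rather than a Block subclass (FakeDB is a stub that just records calls; commit on success, roll back and re-raise on failure; modern except-as syntax):

```python
class FakeDB:
    """Stub database that records what happened to it."""
    def __init__(self):
        self.log = []
    def commit(self):
        self.log.append("commit")
    def rollback(self):
        self.log.append("rollback")

class Transactional:
    """Run a body; commit on success, roll back (and re-raise) on failure."""
    def __init__(self, db):
        self.db = db
    def __call__(self, body, *args):
        try:
            body(*args)
        except Exception:
            self.db.rollback()
            raise
        else:
            self.db.commit()

db = FakeDB()
Transactional(db)(lambda: None)       # succeeds -> commit
try:
    Transactional(db)(lambda: 1 / 0)  # fails -> rollback, error re-raised
except ZeroDivisionError:
    pass
print(db.log)  # ['commit', 'rollback']
```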



## 4. A template that tries something up to n times:
import urllib
class Auto_Retry(Block):
    def start(self, n=3, exc=Exception):
        self.n = n
        self.exc = exc
    def block(self, *args):
        while self.n:
            try:
                self.repeat_block(*args)
                break
            except self.exc, self.__err__:
                self.n -= 1
    def repeat_block(self, *args):
        """Try this block n times"""
        pass

class Retry_Url(Auto_Retry):
    def repeat_block(self, url):
        f = urllib.urlopen(url)
        print f.read()

# This could be slow, so wait for it.
try:
    Retry_Url(3, IOError)("http://cantfind.com/this.html")
except IOError, err:
    print err

Retry_Url(3, IOError)("http://python.org/peps/pep-0340.html")



## 5. It is possible to nest blocks and combine templates:
class Lockit(Get_Lock):
    def block(self, job):
        job()

Lockit(mylock())(Line_Printer("blockclass.py"))



## 7. Redirect stdout temporarily:
import sys
class Redirect_Stdout(Block):
    def start(self, handle=sys.stdout):
        self.save_stdout = sys.stdout
        sys.stdout = handle
    def final(self):
        sys.stdout.close()
        sys.stdout = self.save_stdout
    def block(self):
        pass

class New_Out(Redirect_Stdout):
    def block(self):
        print "Writing to redirected std_out"

New_Out(open('newfile.txt','w'))()



## 8. A variant on opening() that also returns an
##    error condition:
class Append_To(File_Opener):
    def start(self, filename):
        self.f = open(filename, 'a')
    def block(self, string):
        self.f.write(string)

def append_w_error(file, string):
    try:
        Append_To(file)(string)
    except IOError, err:
        print "IOError:", err

append_w_error("/etc/passwd", "guido::0:0::/:/bin/sh\n")

# --- end ---







From greg.ewing at canterbury.ac.nz  Fri May  6 03:55:22 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 06 May 2005 13:55:22 +1200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <FFF043AA-1198-49A9-91E5-76128BC98933@mac.com>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<4279FB92.5050501@gmail.com>
	<FFF043AA-1198-49A9-91E5-76128BC98933@mac.com>
Message-ID: <427ACE8A.4040302@canterbury.ac.nz>

Seems to me it should be up to the block iterator whether
a break statement gets caught or propagated, since it's
up to the block iterator whether the construct behaves
like a loop or not.

This could be achieved by having a separate exception
for breaks, as originally proposed.

If the iterator propagates the Break exception back out,
the block statement should break any enclosing loop.
If the iterator wants to behave like a loop, it can
catch the Break exception and raise StopIteration
instead.
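That dispatch rule can be sketched with a hypothetical BreakIteration exception (every name here is invented for illustration; this is not proposed API):

```python
class BreakIteration(Exception):
    """Hypothetical: raised into the block iterator when the body breaks."""

class LoopingBlock:
    """Wants loop semantics: absorbs the break and just stops iterating."""
    def on_break(self):
        raise StopIteration

class OnePassBlock:
    """Not a loop: lets the break propagate to any enclosing loop."""
    def on_break(self):
        raise BreakIteration

def handle_break(block_iter):
    """Toy driver showing how the block statement would react."""
    try:
        block_iter.on_break()
    except StopIteration:
        return "block ends; enclosing loop keeps going"
    except BreakIteration:
        return "enclosing loop breaks too"

print(handle_break(LoopingBlock()))
print(handle_break(OnePassBlock()))
```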

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From tdelaney at avaya.com  Fri May  6 04:05:05 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Fri, 6 May 2005 12:05:05 +1000
Subject: [Python-Dev] PEP 340: Breaking out.
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE025204E9@au3010avexu1.global.avaya.com>

Greg Ewing wrote:

> Seems to me it should be up to the block iterator whether
> a break statement gets caught or propagated, since it's
> up to the block iterator whether the construct behaves
> like a loop or not.
> 
> This could be achieved by having a separate exception
> for breaks, as originally proposed.
> 
> If the iterator propagates the Break exception back out,
> the block statement should break any enclosing loop.
> If the iterator wants to behave like a loop, it can
> catch the Break exception and raise StopIteration
> instead.

In this scenario (and I'm not saying I approve or disapprove) I think
BreakIteration should inherit from StopIteration (thus retaining the
existing PEP 340 semantics if uncaught)::

    Iteration
     |
     +- ContinueIteration
     |
     +- StopIteration
         |
         +- BreakIteration

I think no matter what it would be useful to be able to separate a break
from a stop sometimes, but there are many cases where they are the same
thing.
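Spelled out as classes, that hierarchy would look something like this (names taken from the sketch above; StopIteration_ is a stand-in class so the example is self-contained, not the real builtin):

```python
class Iteration(Exception):
    pass

class ContinueIteration(Iteration):
    pass

class StopIteration_(Iteration):   # stand-in for the real builtin
    pass

class BreakIteration(StopIteration_):
    pass

# An uncaught BreakIteration still satisfies code that only catches the
# Stop case -- the "retains existing PEP 340 semantics" property above.
try:
    raise BreakIteration
except StopIteration_:
    caught = "stop"
print(caught)  # stop
```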

Tim Delaney

From s.percivall at chello.se  Fri May  6 04:13:51 2005
From: s.percivall at chello.se (Simon Percivall)
Date: Fri, 6 May 2005 04:13:51 +0200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <427ACE8A.4040302@canterbury.ac.nz>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<4279FB92.5050501@gmail.com>
	<FFF043AA-1198-49A9-91E5-76128BC98933@mac.com>
	<427ACE8A.4040302@canterbury.ac.nz>
Message-ID: <535D6FED-0195-491F-ABE8-D4712A131B6C@chello.se>

On 6 maj 2005, at 03.55, Greg Ewing wrote:
> Seems to me it should be up to the block iterator whether
> a break statement gets caught or propagated, since it's
> up to the block iterator whether the construct behaves
> like a loop or not.

And this is not confusing in what way? Making it depend
means you constantly have to readjust your understanding
of the statement based on the context. And this is _if_
you know how it behaves in the particular case. If you're
trying to understand the source code, having break depend
on something defined somewhere completely else seems like
an obstacle to easy understanding. IMHO, of course.

//Simon


From greg.ewing at canterbury.ac.nz  Fri May  6 04:27:54 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 06 May 2005 14:27:54 +1200
Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword
In-Reply-To: <d5cnoa$ds4$1@sea.gmane.org>
References: <1115226841.7909.24.camel@localhost>
	<42791DCB.5050703@hathawaymix.org> <1115238075.10836.4.camel@emperor>
	<d5cnoa$ds4$1@sea.gmane.org>
Message-ID: <427AD62A.40106@canterbury.ac.nz>

Fredrik Lundh wrote:

> the current
> proposal stems from the observation that "for-loop plus generators" in
> today's Python does in fact provide a block implementation that solves
> many use cases in an elegant way.
> 
> PEP 340 builds on this, sorts out a couple of weak points in the current
> design, and adds an elegant syntax for most remaining use cases.

I still can't help feeling we're making a cart/horse
ordering error here, though. Part of me regards the
"for-loop plus generators" idea as an elegant hack,
whose elegance only extends as far as it *doesn't*
require anything beyond existing syntax and semantics.
If new syntax and tricky new interactions with iterators
are needed to support it, it doesn't seem so elegant
any more.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Fri May  6 04:37:47 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 06 May 2005 14:37:47 +1200
Subject: [Python-Dev] PEP 340 - Remaining issues - keyword
In-Reply-To: <427A4AE9.80007@gmail.com>
References: <79990c6b0505050901fe38af1@mail.gmail.com>
	<427A4AE9.80007@gmail.com>
Message-ID: <427AD87B.1030407@canterbury.ac.nz>

Nick Coghlan wrote:

> Something relatively nonsensical, but usefully mnemonic
> (like 'stmt') may be a good way to go.

How about 'do'?

   do opening(filename) as f:
     ...

   do locking(obj):
     ...

   do carefully(): # :-)
     ...

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Fri May  6 04:54:36 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 06 May 2005 14:54:36 +1200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <535D6FED-0195-491F-ABE8-D4712A131B6C@chello.se>
References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr>
	<17015.39213.522060.873605@montanaro.dyndns.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<4279FB92.5050501@gmail.com>
	<FFF043AA-1198-49A9-91E5-76128BC98933@mac.com>
	<427ACE8A.4040302@canterbury.ac.nz>
	<535D6FED-0195-491F-ABE8-D4712A131B6C@chello.se>
Message-ID: <427ADC6C.6060101@canterbury.ac.nz>

Simon Percivall wrote:

> And this is not confusing in what way?

I don't think it's any less confusing than having a
construct in the first place which can either be a
loop or not. You need to know the semantics of the
block iterator in order to know whether it's a loop.
Once you know that, you know how break behaves (as
long as the iterator is sanely designed).


> Making it depend
> means you constantly have to readjust your understanding
> of the statement based on the context. And this is _if_
> you know how it behaves in the particular case. If you're
> trying to understand the source code, having break depend
> on something defined somewhere else entirely seems like
> an obstacle to easy understanding. IMHO, of course.
> 
> //Simon
> 


-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Fri May  6 04:56:50 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 06 May 2005 14:56:50 +1200
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE025204E9@au3010avexu1.global.avaya.com>
References: <338366A6D2E2CA4C9DAEAE652E12A1DE025204E9@au3010avexu1.global.avaya.com>
Message-ID: <427ADCF2.6030305@canterbury.ac.nz>

Delaney, Timothy C (Timothy) wrote:

> In this scenario (and I'm not saying I approve or disapprove) I think
> BreakIteration should inherit from StopIteration (thus retaining the
> existing PEP 340 semantics if uncaught)::

Not sure I understand. The point of my suggestion was
to *not* retain existing PEP 340 semantics if uncaught,
i.e. break an enclosing loop rather than just terminate
the block statement.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Fri May  6 05:04:58 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 06 May 2005 15:04:58 +1200
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <427A3971.8030400@gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<9768a4652f7ffa1778560b3548609827@xs4all.nl>
	<427A3971.8030400@gmail.com>
Message-ID: <427ADEDA.1050205@canterbury.ac.nz>

Nick Coghlan wrote:

> What does a try statement with neither an except clause nor a finally clause mean?

I guess it would mean the same as

   if 1:
     ...

Not particularly useful, but maybe it's not worth complexifying
the grammar just for the sake of disallowing it.

Also, some people might find it useful for indenting a block
of code for cosmetic reasons, although that could easily
be seen as an abuse...

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Fri May  6 05:38:34 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 06 May 2005 15:38:34 +1200
Subject: [Python-Dev] PEP 340 - For loop cleanup, and feature separation
In-Reply-To: <427AD87B.1030407@canterbury.ac.nz>
References: <79990c6b0505050901fe38af1@mail.gmail.com>
	<427A4AE9.80007@gmail.com> <427AD87B.1030407@canterbury.ac.nz>
Message-ID: <427AE6BA.1060607@canterbury.ac.nz>

I'm still bothered by the idea of for-loops not participating
in the new generator finalization protocol.

It's all very well to say that iterators designed for block
statements shouldn't be used in for-loops, but there may
be more subtle cases to consider, such as

   def file_frobulations(filenames):
     for filename in filenames:
       block opening(filename) as f:
         yield something_derived_from(f)

This is clearly intended to be a normal iterator, not a
block iterator, and thus should be usable in a for-loop.
But if for-loops don't use the finalization protocol,
it won't be guaranteed to work correctly.
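
The hazard can be sketched concretely (a hypothetical `opening` generator,
shown in today's Python, where generators grew a close() method): a
for-loop that breaks simply abandons the generator, and nothing but an
explicit close or garbage collection ever runs its cleanup.

```python
import os
import tempfile

def opening(filename):
    # Hypothetical resource-managing generator: cleanup lives in 'finally'.
    f = open(filename)
    try:
        yield f
    finally:
        f.close()

fd, path = tempfile.mkstemp()
os.close(fd)

gen = opening(path)
for f in gen:
    break                        # abandon the generator mid-suite

still_open = not f.closed        # the 'finally' clause has not run yet
gen.close()                      # what a finalizing for-loop would do
closed_now = f.closed

os.remove(path)
```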

So I think that, in the interests of least surprise,
for-loops should provide the same finalization promises
as block statements.

If that is done, it becomes easier to decide whether
the block statement should loop. The answer is probably
no: If you're writing a loop, you use a for-statement;
if you're not, you use a block-statement. This helps
to clearly differentiate the two, and provide
justification for having two statements.

I'm also starting to like the idea of having a
completely separate protocol for block-statements,
and an adaptor of some kind for using generators to
implement them. We would then have two *totally*
separated new features:

1) A block-statement, and an associated protocol
    designed specifically for it, independent of
    any existing protocols.

2) A mechanism for passing values and exceptions into
    generators, which is useful in its own right and
    already has use cases.

With the addition of an adaptor between the block
protocol and the iterator protocol, which can be
implemented *without* needing any further features,
these two features then combine synergistically to
give you block iterators implemented by generators.
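
Such an adaptor can indeed be sketched without any new language features
(hypothetical names throughout; this is roughly the shape that
contextlib.contextmanager later took):

```python
class GeneratorBlock:
    # Hypothetical adaptor from a one-shot generator to a
    # block-style enter/exit protocol.
    def __init__(self, gen):
        self.gen = gen

    def __enter__(self):
        return next(self.gen)    # run up to the yield

    def __exit__(self, exc_type, exc_value, traceback):
        if exc_value is None:
            try:
                next(self.gen)   # resume; the generator should finish
            except StopIteration:
                pass
        else:
            self.gen.close()     # run the generator's 'finally' clause
        return False             # never swallow the exception

def opening(log):
    log.append('setup')
    try:
        yield log
    finally:
        log.append('cleanup')

log = []
blk = GeneratorBlock(opening(log))
blk.__enter__()
blk.__exit__(None, None, None)
```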

This makes a lot more sense to me than trying to
warp the iterator protocol to make it into a block
protocol as well. Let's keep the two things orthogonal.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From tdelaney at avaya.com  Fri May  6 05:58:48 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Fri, 6 May 2005 13:58:48 +1000
Subject: [Python-Dev] PEP 340 - For loop cleanup, and feature separation
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE025204EE@au3010avexu1.global.avaya.com>

Greg Ewing wrote:

> I'm still bothered by the idea of for-loops not participating
> in the new generator finalization protocol.

I agree - that's always been nagging at me too.

The problem with it is that then you either:

1. Have a guarantee that an iterator will be exhausted when the for loop
exits (as per block statements);

OR

2. Rely on garbage collection (reference counting, etc) to ensure
finalisation.

Personally, I'm of the opinion that we should make a significant break
(no pun intended ;) and have for-loops attempt to ensure that iterators
are exhausted. An iterator could be specifically designed to prevent
that if needed, but in the vast majority of cases an iterator is never
used after the for-loop.

An example of an iterator that was specifically designed to continue
working after its initial for-loop would be::

    def gen():
        try:
            setup1()

            try:
                yield 1
            finally:
                cleanup1()

        except StopIteration:
            pass

        setup2()

        try:
            yield 2
        finally:
            cleanup2()

This allows cleanup, but then continued usage.

> So I think that, in the interests of least surprise,
> for-loops should provide the same finalization promises
> as block statements.

Agreed.

> If that is done, it becomes easier to decide whether
> the block statement should loop. The answer is probably
> no: If you're writing a loop, you use a for-statement;
> if you're not, you use a block-statement. This helps
> to clearly differentiate the two, and provide
> justification for having two statements.

Agreed.

> I'm also starting to like the idea of having a
> completely separate protocol for block-statements

+1

> and an adaptor of some kind for using generators to
> implement them.

My preference would be direct syntactic support...

Tim Delaney

From mal at egenix.com  Fri May  6 09:17:26 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 06 May 2005 09:17:26 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
Message-ID: <427B1A06.4010004@egenix.com>

Nicholas Bastin wrote:
> On May 4, 2005, at 6:20 PM, Shane Hathaway wrote:
> 
>>>Nicholas Bastin wrote:
>>>
>>>
>>>>"This type represents the storage type which is used by Python
>>>>internally as the basis for holding Unicode ordinals.  Extension module
>>>>developers should make no assumptions about the size of this type on
>>>>any given platform."
>>>
>>>
>>>But people want to know "Is Python's Unicode 16-bit or 32-bit?"
>>>So the documentation should explicitly say "it depends".
>>
>>On a related note, it would help if the documentation provided a
>>little more background on unicode encoding.  Specifically, that UCS-2 is
>>not the same as UTF-16, even though they're both two bytes wide and most
>>of the characters are the same.  UTF-16 can encode 4 byte characters,
>>while UCS-2 can't.  A Py_UNICODE is either UCS-2 or UCS-4.  It took me
> 
> I'm not sure the Python documentation is the place to teach someone 
> about unicode.  The ISO 10646 pretty clearly defines UCS-2 as only 
> containing characters in the BMP (plane zero).  On the other hand, I 
> don't know why python lets you choose UCS-2 anyhow, since it's almost 
> always not what you want.

You've got that wrong: Python lets you choose UCS-4 -
UCS-2 is the default.

Note that Python's Unicode codecs UTF-8 and UTF-16
are surrogate aware and thus support non-BMP code points
regardless of the build type: A UCS2-build of Python will
store a non-BMP code point as UTF-16 surrogate pair in the
Py_UNICODE buffer while a UCS4 build will store it as a
single value. Decoding is surrogate aware too, so a UTF-16
surrogate pair in a UCS2 build will get treated as single
Unicode code point.
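
For illustration, the surrogate handling described here is visible at the
codec level (shown in today's Python 3, where the UCS-2/UCS-4 build
distinction no longer exists but the codecs behave the same way):

```python
# U+10000 is the first code point outside the BMP.
ch = '\U00010000'

# Encoding yields a UTF-16 surrogate pair: high surrogate D800, low DC00.
data = ch.encode('utf-16-be')

# Decoding is surrogate aware: the pair comes back as ONE code point.
roundtrip = data.decode('utf-16-be')
```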

Ideally, the Python programmer should not really need to
know all this, and I think we've achieved that up to a certain
point (Unicode can be complicated - there's nothing to hide there).
However, the C programmer using the Python C API to interface
to some other Unicode implementation will need to know these
details.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 06 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From mal at egenix.com  Fri May  6 09:19:28 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 06 May 2005 09:19:28 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <d5b45s$ee8$3@sea.gmane.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<4qdjm0rt.fsf@python.net>
	<d5b45s$ee8$3@sea.gmane.org>
Message-ID: <427B1A80.9060308@egenix.com>

Fredrik Lundh wrote:
> Thomas Heller wrote:
> 
> 
>>AFAIK, you can configure Python to use 16-bit or 32-bit Unicode chars,
>>independent of the size of wchar_t.  The HAVE_USABLE_WCHAR_T macro
>>can be used by extension writers to determine if Py_UNICODE is the same as
>>wchar_t.
> 
> 
> note that "usable" is more than just "same size"; it also implies that widechar
> predicates (iswalnum etc) work properly with Unicode characters, under all
> locales.

Only if you intend to use --with-wctypes; a configure option which
will go away soon (for exactly the reason you are referring to: the
widechar predicates don't work properly under all locales).

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 06 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From mal at egenix.com  Fri May  6 09:25:10 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 06 May 2005 09:25:10 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <ed6dac3aa136985e60159713ff1d75ac@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<ed6dac3aa136985e60159713ff1d75ac@opnet.com>
Message-ID: <427B1BD6.1060206@egenix.com>

Nicholas Bastin wrote:
> On May 4, 2005, at 6:03 PM, Martin v. L?wis wrote:
> 
> 
>>Nicholas Bastin wrote:
>>
>>>"This type represents the storage type which is used by Python
>>>internally as the basis for holding Unicode ordinals.  Extension module
>>>developers should make no assumptions about the size of this type on
>>>any given platform."
>>
>>But people want to know "Is Python's Unicode 16-bit or 32-bit?"
>>So the documentation should explicitly say "it depends".
> 
> 
> The important piece of information is that it is not guaranteed to be a 
> particular one of those sizes.  Once you can't guarantee the size, no 
> one really cares what size it is.  The documentation should discourage 
> developers from attempting to manipulate Py_UNICODE directly, which, 
> other than trivia, is the only reason why someone would care what size 
> the internal representation is.

I don't see why you shouldn't use the Py_UNICODE buffer directly.
After all, the reason why we have that typedef is to make it
possible to program against an abstract type - regardless of
its size on the given platform.

In that respect it is similar to wchar_t (and all the other
*_t typedefs in C).

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 06 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From p.f.moore at gmail.com  Fri May  6 10:39:59 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 6 May 2005 09:39:59 +0100
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <d11dcfba05050514344a8f4517@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org> <d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<4278D7D7.2040805@gmail.com>
	<d11dcfba050504083551bb0a1e@mail.gmail.com>
	<4279F909.6030206@gmail.com>
	<d11dcfba050505080585fa710@mail.gmail.com> <427A4422.4@gmail.com>
	<79990c6b05050512582e52a7af@mail.gmail.com>
	<d11dcfba05050514344a8f4517@mail.gmail.com>
Message-ID: <79990c6b05050601395211a722@mail.gmail.com>

On 5/5/05, Steven Bethard <steven.bethard at gmail.com> wrote:
> On 5/5/05, Paul Moore <p.f.moore at gmail.com> wrote:
> > And does your proposal allow for "continue EXPR" as supported by PEP
> > 340? I can't see that it could, given that your proposal treats block
> > statements as not being loops.
> 
> Read PEP 340 again -- the "continue EXPR" syntax is orthogonal to the
> discussion -- PEP 340 adds it for *all* for loops, so for loops with
> the non-looping block statements would also be able to use it.

I know this. But we're talking here about Nick's new proposal for a
non-looping block. All I am saying is that the new proposal needs to
include this orthogonal feature. If it's a modification to PEP 340,
that will come naturally. If it's a modification to PEP 310, it won't.
A new PEP needs to include it.

I am very much against picking bits out of a number of PEPs - that was
implicit in my earlier post - sorry, I should have made it explicit.
Specifically, PEP 340 should be accepted (possibly with modifications)
as a whole, or rejected outright - no "rejected, but can we have
continue EXPR in any case, as it's orthogonal" status exists...

> > The looping behaviour is a (fairly nasty) wart, but I'm not sure I
> > would insist on removing it at the cost of damaging other features I
> > like.
> 
> I don't think it "damages" any features.  Are there features you still
> think the non-looping proposal removes?  (I'm not counting orthogonal
> features like "continue EXPR" which could easily be added as an
> entirely separate PEP.)

I *am* specifically referring to these "orthogonal" features. Removal
of looping by modification of PEP 340 will do no such "damage", I
agree - but removal by accepting an updated PEP 310, or a new PEP,
*will* (unless the "entirely separate PEP" you mention is written and
accepted along with the non-looping PEP - and I don't think that will
happen).

Thanks for making me clarify what I meant. I left a little too much
implicit in my previous post.

Paul.

From rrr at ronadam.com  Fri May  6 10:40:10 2005
From: rrr at ronadam.com (Ron Adam)
Date: Fri, 06 May 2005 04:40:10 -0400
Subject: [Python-Dev] PEP 340: Examples as class's.
In-Reply-To: <427AB674.5010809@ronadam.com>
References: <427AB674.5010809@ronadam.com>
Message-ID: <427B2D6A.4090803@ronadam.com>

Ron Adam wrote:


A minor correction to the Block class due to re-editing.

>     def __call__(self, *args):
>         self.block(*args)
>         self.__del__()

This should have been.

     def __call__(self, *args):
         try:
             self.block(*args)
         except Exception, self.__err__:
             pass
         self.__del__()

This catches the error in the overridden "block" method (I need to
change that to say "body") so it can be re-raised after the "final"
method is run. The "final" method can handle it if it chooses.

Thanks to Jim Jewett for noticing. It should make more sense now.  :-)


In example (1.), Lock_It lost a carriage return. It should be.

     class Lock_It(Get_Lock):
         def block(self):
             print "Do stuff while locked"

     Lock_It(mylock())()


And example (3.) should be, although it may not run as is...

## 3. A template for committing or rolling back a database:
class Transactional(Block):
     def start(self, db):
         self.db = db
         self.cursor = self.db.cursor()
     def final(self):
         if self.__err__:
             self.db.rollback()
             print "db rolled back due to err:", self.__err__
             self.__err__ = None
         else:
             self.db.commit()
     def block(self, batch):
         for statement in batch:
             self.cursor.execute(statement)

statement_batch = [
     "insert into PEP340 values ('Guido','BDFL')",
     "insert into PEP340 values ('More examples are needed')"]
db = pgdb.connect(dsn = 'localhost:pythonpeps')
Transactional(db)(statement_batch)
db.close()

Another Block class could be used for connecting and disconnecting.


Cheers, Ron_Adam

From p.f.moore at gmail.com  Fri May  6 10:41:42 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 6 May 2005 09:41:42 +0100
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <427ACE8A.4040302@canterbury.ac.nz>
References: <20050503150510.GA13595@onegeek.org>
	<ca471dc20505031013287a2e92@mail.gmail.com>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<f5fa4a1e1fa134d1068bdfd53a995c71@yahoo.com>
	<4279FB92.5050501@gmail.com>
	<FFF043AA-1198-49A9-91E5-76128BC98933@mac.com>
	<427ACE8A.4040302@canterbury.ac.nz>
Message-ID: <79990c6b0505060141604acaff@mail.gmail.com>

On 5/6/05, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Seems to me it should be up to the block iterator whether
> a break statement gets caught or propagated, since it's
> up to the block iterator whether the construct behaves
> like a loop or not.
> 
> This could be achieved by having a separate exception
> for breaks, as originally proposed.
> 
> If the iterator propagates the Break exception back out,
> the block statement should break any enclosing loop.
> If the iterator wants to behave like a loop, it can
> catch the Break exception and raise StopIteration
> instead.

Yes, that's exactly what I was trying to say! I don't know if it's
achievable in practice, but the fact that it was in the original
proposal (something I'd forgotten, if indeed I ever realised) makes it
seem more likely to me.

Paul.

From nidoizo at yahoo.com  Fri May  6 11:09:39 2005
From: nidoizo at yahoo.com (Nicolas Fleury)
Date: Fri, 06 May 2005 05:09:39 -0400
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <79990c6b05050512582e52a7af@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org>	<17015.48830.223391.390538@montanaro.dyndns.org>	<ca471dc2050503113127f938b0@mail.gmail.com>	<d58tkb$fvp$1@sea.gmane.org>	<79990c6b050504015762d004ac@mail.gmail.com>	<4278D7D7.2040805@gmail.com>	<d11dcfba050504083551bb0a1e@mail.gmail.com>	<4279F909.6030206@gmail.com>	<d11dcfba050505080585fa710@mail.gmail.com>
	<427A4422.4@gmail.com> <79990c6b05050512582e52a7af@mail.gmail.com>
Message-ID: <427B3453.40409@yahoo.com>

Paul Moore wrote:
> On 5/5/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>   2. Manual protocol implementations are _significantly_ easier to write
> 
> Hmm, I've not tried so I'll have to take your word for this. But I
> don't imagine writing manual implementations much - one of the key
> features I like about Guido's proposal is that generators can be used,
> and the implementation is a clear template, with "yield" acting as a
> "put the block here" marker (yes, I know that's an
> oversimplification!).

If using a generator is easier to code (though I tend to agree with Nick),
a new type, a one-shot generator (not really a generator, but some type
of continuation), as suggested by Steven Bethard with stmt, could be created:

     def opening(filename, mode="r"):
         f = open(filename, mode)
         try:
             yield break f
         finally:
             f.close()

I prefer Nick's proposal however, since it simplifies non-looping
constructs (no generator template, break of parent loop supported),
while leaving looping constructs (a minority, IMO) possible using a
for, making things even clearer to me (but harder to implement).  I'm
still not convinced at all that using generators to implement an
acquire/release pattern is a good idea...

Regards,
Nicolas


From mwh at python.net  Fri May  6 11:48:56 2005
From: mwh at python.net (Michael Hudson)
Date: Fri, 06 May 2005 10:48:56 +0100
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <79990c6b05050512582e52a7af@mail.gmail.com> (Paul Moore's
	message of "Thu, 5 May 2005 20:58:12 +0100")
References: <20050503150510.GA13595@onegeek.org>
	<17015.48830.223391.390538@montanaro.dyndns.org>
	<ca471dc2050503113127f938b0@mail.gmail.com>
	<d58tkb$fvp$1@sea.gmane.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<4278D7D7.2040805@gmail.com>
	<d11dcfba050504083551bb0a1e@mail.gmail.com>
	<4279F909.6030206@gmail.com>
	<d11dcfba050505080585fa710@mail.gmail.com> <427A4422.4@gmail.com>
	<79990c6b05050512582e52a7af@mail.gmail.com>
Message-ID: <2mis1wwupj.fsf@starship.python.net>

Paul Moore <p.f.moore at gmail.com> writes:

> On 5/5/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
>> Well, Michael Hudson and Paul Moore are the current authors of PEP 310, so
>> updating it with any of my ideas would be their call.
>
> I'm willing to consider an update - I don't know Michael's view. 

I'd slightly prefer PEP 310 to remain a very simple proposal, but
don't really have the energy to argue with someone who thinks
rewriting it makes more sense than creating a new PEP.

Cheers,
mwh

-- 
  Solaris: Shire horse that dreams of being a race horse,
  blissfully unaware that its owners don't quite know whether
  to put it out to grass, to stud, or to the knackers yard.
                           -- Jim's pedigree of operating systems, asr

From tdickenson at devmail.geminidataloggers.co.uk  Fri May  6 12:20:31 2005
From: tdickenson at devmail.geminidataloggers.co.uk (Toby Dickenson)
Date: Fri, 6 May 2005 11:20:31 +0100
Subject: [Python-Dev] PEP 340: Non-looping version (aka PEP 310 redux)
In-Reply-To: <427A35DA.8050505@iinet.net.au>
References: <427A35DA.8050505@iinet.net.au>
Message-ID: <200505061120.31117.tdickenson@devmail.geminidataloggers.co.uk>

On Thursday 05 May 2005 16:03, Nick Coghlan wrote:
> The discussion on the meaning of break when nesting a PEP 340 block
> statement inside a for loop has given me some real reasons to prefer PEP
> 310's single pass  semantics for user defined statements 

That also solves a problem with resource acquisition block generators that I
hadn't been able to articulate until now. What about resources whose lifetimes
are more complex than a lexical block, where you can't use a block statement?
It seems quite natural for code that wants to manage its own resources to call
__enter__ and __exit__ directly. That's not true of the block generator API.
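
The point can be sketched with a hypothetical resource class: when code
calls the PEP 310 methods itself, acquisition and release need not
bracket any lexical block at all.

```python
class Locking:
    # Hypothetical PEP 310-style resource object.
    def __init__(self):
        self.held = False

    def __enter__(self):
        self.held = True          # acquire the resource
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self.held = False         # release the resource

res = Locking()
res.__enter__()                   # acquire now, in one place...
held_during = res.held
res.__exit__(None, None, None)    # ...release later, from anywhere
held_after = res.held
```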



-- 
Toby Dickenson

From michele.simionato at gmail.com  Fri May  6 12:35:41 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Fri, 6 May 2005 06:35:41 -0400
Subject: [Python-Dev] my first post: asking about a "decorator" module
In-Reply-To: <002a01c55158$64275b80$11bd2c81@oemcomputer>
References: <4edc17eb0505042347a9d02be@mail.gmail.com>
	<002a01c55158$64275b80$11bd2c81@oemcomputer>
Message-ID: <4edc17eb050506033557b6333b@mail.gmail.com>

On 5/5/05, Raymond Hettinger <python at rcn.com> wrote:
> 
> Yes, there has been quite a bit of interest including several ASPN
> recipes and a wiki:
> 
>    http://www.python.org/moin/PythonDecoratorLibrary

Thanks, I didn't know about that page. BTW, I notice that all the decorators
in that page are improper, in the sense that they change the signature of
the function they decorate. So, all those recipes would need some help
from my decorator module, to make them proper ;-)

http://www.phyast.pitt.edu/~micheles/python/decorator.zip

From nidoizo at yahoo.com  Fri May  6 14:33:29 2005
From: nidoizo at yahoo.com (Nicolas Fleury)
Date: Fri, 06 May 2005 08:33:29 -0400
Subject: [Python-Dev] PEP 340: Breaking out.
In-Reply-To: <ca471dc205050315392e00f98c@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org>
	<427797D5.8030207@cirad.fr>	<17015.39213.522060.873605@montanaro.dyndns.org>	<ca471dc20505031013287a2e92@mail.gmail.com>	<17015.48830.223391.390538@montanaro.dyndns.org>	<ca471dc2050503113127f938b0@mail.gmail.com>	<d58tkb$fvp$1@sea.gmane.org>
	<ca471dc205050315392e00f98c@mail.gmail.com>
Message-ID: <d5fnkp$342$1@sea.gmane.org>

Guido van Rossum wrote:
>>Maybe generators are not the way to go, but could be
>>supported natively by providing a __block__ function, very similarly to
>>sequences providing an __iter__ function for for-loops?
> 
> Sorry, I have no idea what you are proposing here.

I was suggesting that the feature could be a PEP310-like object and that 
a __block__ function (or whatever) of the generator could return such an 
object.  But at this point, Nick's proposition is what I prefer.  I find 
the use of generators very elegant, but I'm still unconvinced it is a 
good idea to use them to implement an acquire/release pattern.  Even if 
another continuation mechanism would be used (like Steven's idea), it 
would still be a lot of concepts used to implement acquire/release.

Regards,
Nicolas


From raymond.hettinger at verizon.net  Fri May  6 15:59:20 2005
From: raymond.hettinger at verizon.net (Raymond Hettinger)
Date: Fri, 06 May 2005 09:59:20 -0400
Subject: [Python-Dev] my first post: asking about a "decorator" module
In-Reply-To: <4edc17eb050506033557b6333b@mail.gmail.com>
Message-ID: <002401c55243$cd6a2820$11bd2c81@oemcomputer>

> > Yes, there has been quite a bit of interest including several ASPN
> > recipes and a wiki:
> >
> >    http://www.python.org/moin/PythonDecoratorLibrary
> 
> Thanks, I didn't know about that page. BTW, I notice that all the
> decorators
> in that page are improper, in the sense that they change the signature
> of
> the function they decorate. 

Signature changing and signature preserving are probably better
classifications than proper and improper.  Even then, some decorators
like atexit() and classmethod() may warrant their own special
categories.
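
The metadata half of the distinction is easy to demonstrate with
functools.wraps (added later, in Python 2.5); note that even wraps only
copies attributes like __name__ and __doc__, not the argument signature
that Michele's module reconstructs.

```python
import functools

def naive(func):
    # Signature-changing: the wrapper's identity replaces the function's.
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def preserving(func):
    # Signature-preserving (metadata only): copies __name__, __doc__, ...
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@naive
def f(x):
    "add one"
    return x + 1

@preserving
def g(x):
    "add one"
    return x + 1

lost = f.__name__    # the decorated function's identity is lost
kept = g.__name__    # the decorated function's identity is preserved
```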


Raymond


From michele.simionato at gmail.com  Fri May  6 16:41:08 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Fri, 6 May 2005 10:41:08 -0400
Subject: [Python-Dev] The decorator module
In-Reply-To: <fb6fbf560505060730789906e2@mail.gmail.com>
References: <fb6fbf560505060730789906e2@mail.gmail.com>
Message-ID: <4edc17eb0505060741635ecde8@mail.gmail.com>

On 5/6/05, Jim Jewett <jimjjewett at gmail.com> wrote:
> Thank you; this is very good.
> 
> I added a link to it from http://www.python.org/moin/PythonDecoratorLibrary;
> please also consider adding a version number and publishing via PyPI.

Yes, this was in my plans. For the moment, however, this is just version 0.1;
I want to wait a bit before making an official release.

> Incidentally, would the resulting functions be a bit faster if you compiled
> the lambda instead of repeatedly eval ing it, or does the eval overhead still
> apply?
> 
> -jJ
> 

Honestly, I don't care, since "eval" happens only once at decoration time.
There is no "eval" overhead at calling time, so I do not expect to have
problems. I am waiting for volunteers to perform profiling and
performance analysis ;)

            Michele Simionato

From gvanrossum at gmail.com  Fri May  6 16:51:03 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 07:51:03 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <427ADEDA.1050205@canterbury.ac.nz>
References: <d59vll$4qf$1@sea.gmane.org>
	<9768a4652f7ffa1778560b3548609827@xs4all.nl>
	<427A3971.8030400@gmail.com> <427ADEDA.1050205@canterbury.ac.nz>
Message-ID: <ca471dc205050607513f9c1c9b@mail.gmail.com>

[Nick Coghlan]
> 
> > What does a try statement with neither an except clause nor a finally clause mean?

[Greg Ewing]
> I guess it would mean the same as
> 
>    if 1:
>      ...
> 
> Not particularly useful, but maybe it's not worth complexifying
> the grammar just for the sake of disallowing it.
> 
> Also, some people might find it useful for indenting a block
> of code for cosmetic reasons, although that could easily
> be seen as an abuse...

I strongly disagree with this. It should be this:

try_stmt: 'try' ':' suite
          (
            (except_clause ':' suite)+
            ['else' ':' suite] ['finally' ':' suite]
          |
            'finally' ':' suite
          )

There is no real complexity in this grammar, it's unambiguous, it's an
easy enough job for the code generator, and it catches a certain class
of mistakes (like mis-indenting some code).
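
For illustration, the combined form this grammar permits (the unification
was eventually accepted for Python 2.5 as PEP 341) reads like:

```python
log = []

def parse(text):
    try:
        value = int(text)
    except ValueError:
        value = None
    else:
        value += 1          # runs only if int() did not raise
    finally:
        log.append(text)    # runs in every case
    return value

print(parse("41"))    # 42
print(parse("oops"))  # None
```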

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Fri May  6 16:55:22 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 07:55:22 -0700
Subject: [Python-Dev] The decorator module
In-Reply-To: <4edc17eb0505060741635ecde8@mail.gmail.com>
References: <fb6fbf560505060730789906e2@mail.gmail.com>
	<4edc17eb0505060741635ecde8@mail.gmail.com>
Message-ID: <ca471dc205050607553b4bafed@mail.gmail.com>

[jJ]
> > Incidentally, would the resulting functions be a bit faster if you compiled
> > the lambda instead of repeatedly eval ing it, or does the eval overhead still
> > apply?

[Michele]
> Honestly, I don't care, since "eval" happens only once at decoration time.
> There is no "eval" overhead at calling time, so I do not expect to have
> problems. I am waiting for volunteers to perform profiling and
> performance analysis ;)

Watch out. I didn't see the code referred to, but realize that eval is
*very* expensive on some other implementations of Python (Jython and
IronPython). Eval should only be used if there is actual user-provided
input that you don't know yet when your module is compiled; not to get
around some limitation in the language (there are usually ways around
that, and occasionally we add one, e.g. getattr()).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From michele.simionato at gmail.com  Fri May  6 17:05:07 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Fri, 6 May 2005 11:05:07 -0400
Subject: [Python-Dev] The decorator module
In-Reply-To: <ca471dc205050607553b4bafed@mail.gmail.com>
References: <fb6fbf560505060730789906e2@mail.gmail.com>
	<4edc17eb0505060741635ecde8@mail.gmail.com>
	<ca471dc205050607553b4bafed@mail.gmail.com>
Message-ID: <4edc17eb05050608053199fbb7@mail.gmail.com>

On 5/6/05, Guido van Rossum <gvanrossum at gmail.com> wrote:
> [Michele]
> > Honestly, I don't care, since "eval" happens only once at decoration time.
> > There is no "eval" overhead at calling time, so I do not expect to have
> > problems. I am waiting for volunteers to perform profiling and
> > performance analysis ;)
> 
> Watch out. I didn't see the code referred to, but realize that eval is
> *very* expensive on some other implementations of Python (Jython and
> IronPython). Eval should only be used if there is actual user-provided
> input that you don't know yet when your module is compiled; not to get
> around some limitation in the language (there are usually ways around
> that, and occasionally we add one, e.g. getattr()).

I actually posted the code on c.l.p. one month ago asking if there was
a way to avoid "eval", but I had no answer. So, let me repost the code
here and see if somebody comes out with a good solution.
It is only ~30 lines long (+ ~30 of comments & docstrings)

## I suggest you uncomment the 'print lambda_src' statement in _decorate
## to understand what is going on.

import inspect

def _signature_gen(func, rm_defaults=False):
    argnames, varargs, varkwargs, defaults = inspect.getargspec(func)
    argdefs = defaults or ()
    n_args = func.func_code.co_argcount
    n_default_args = len(argdefs)
    n_non_default_args = n_args - n_default_args    
    non_default_names = argnames[:n_non_default_args]
    default_names = argnames[n_non_default_args:]
    for name in non_default_names:
        yield "%s" % name
    for i, name in enumerate(default_names):
        if rm_defaults:
            yield name
        else:
            yield "%s = arg[%s]" % (name, i) 
    if varargs:
        yield "*%s" % varargs
    if varkwargs:
        yield "**%s" % varkwargs

def _decorate(func, caller):
    signature = ", ".join(_signature_gen(func))
    variables = ", ".join(_signature_gen(func, rm_defaults=True))   
    lambda_src = "lambda %s: call(func, %s)" % (signature, variables)
    # print lambda_src # for debugging
    evaldict = dict(func=func, call=caller, arg=func.func_defaults or ())
    dec_func = eval(lambda_src, evaldict)
    dec_func.__name__ = func.__name__
    dec_func.__doc__ = func.__doc__
    dec_func.__dict__ = func.__dict__ # copy if you want to avoid sharing
    return dec_func

class decorator(object):
    """General purpose decorator factory: takes a caller function as
    input and returns a decorator. A caller function is any function like this:

    def caller(func, *args, **kw):
        # do something
        return func(*args, **kw)
    
    Here is an example of usage:

    >>> @decorator
    ... def chatty(f, *args, **kw):
    ...     print "Calling %r" % f.__name__
    ...     return f(*args, **kw)
    >>> @chatty
    ... def f(): pass
    >>> f()
    Calling 'f'
    """
    def __init__(self, caller):
        self.caller = caller
    def __call__(self, func):
        return _decorate(func, self.caller)
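
For comparison, here is a rough modern sketch of the same factory without
eval, using functools.wraps (which did not exist in this form at the time;
unlike the eval-based version above, it does not rebuild the exact argument
signature, it only copies metadata):

```python
import functools

def decorator(caller):
    # Same interface as the class above: takes a caller, returns a decorator.
    def decorate(func):
        @functools.wraps(func)  # copies __name__, __doc__, __dict__
        def wrapper(*args, **kw):
            return caller(func, *args, **kw)
        return wrapper
    return decorate

calls = []

@decorator
def chatty(f, *args, **kw):
    calls.append(f.__name__)
    return f(*args, **kw)

@chatty
def add(x, y):
    return x + y

print(add(1, 2))     # 3
print(add.__name__)  # add -- metadata preserved, as in _decorate above
```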

Michele Simionato

From gvanrossum at gmail.com  Fri May  6 17:20:24 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 08:20:24 -0700
Subject: [Python-Dev] PEP 340 - Remaining issues - keyword
In-Reply-To: <427AD87B.1030407@canterbury.ac.nz>
References: <79990c6b0505050901fe38af1@mail.gmail.com>
	<427A4AE9.80007@gmail.com> <427AD87B.1030407@canterbury.ac.nz>
Message-ID: <ca471dc20505060820476cdd6c@mail.gmail.com>

[Greg Ewing]
> How about 'do'?
> 
>    do opening(filename) as f:
>      ...
> 
>    do locking(obj):
>      ...
> 
>    do carefully(): # :-)
>      ...

I've been thinking of that too. It's short, and in a nostalgic way
conveys that it's a loop, without making it too obvious. (Those too
young to get that should Google for do-loop. :-)

I wonder how many folks call their action methods do() though.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tim.peters at gmail.com  Fri May  6 17:29:55 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Fri, 6 May 2005 11:29:55 -0400
Subject: [Python-Dev] PEP 340 - Remaining issues - keyword
In-Reply-To: <ca471dc20505060820476cdd6c@mail.gmail.com>
References: <79990c6b0505050901fe38af1@mail.gmail.com>
	<427A4AE9.80007@gmail.com> <427AD87B.1030407@canterbury.ac.nz>
	<ca471dc20505060820476cdd6c@mail.gmail.com>
Message-ID: <1f7befae05050608294a5c34d3@mail.gmail.com>

[Guido]
> ...
> I wonder how many folks call their action methods do() though.

A little Google(tm)-ing suggests it's not all that common, although it
would break Zope on NetBSD:

    http://www.zope.org/Members/tino/ZopeNetBSD

I can live with that <wink>.

From gvanrossum at gmail.com  Fri May  6 17:34:42 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 08:34:42 -0700
Subject: [Python-Dev] PEP 340 -- Clayton's keyword?
In-Reply-To: <4279AB27.3010405@canterbury.ac.nz>
References: <ca471dc20505021755518773c8@mail.gmail.com>
	<20050503201400.GE30548@solar.trillke.net>
	<ca471dc2050503132010abb4df@mail.gmail.com>
	<20020107054513.566d74ed@localhost.localdomain>
	<d5b4dr$fa8$1@sea.gmane.org>
	<b348a0850505041157dfb3659@mail.gmail.com>
	<d5b7f0$spf$1@sea.gmane.org> <42792888.10209@hathawaymix.org>
	<4279AB27.3010405@canterbury.ac.nz>
Message-ID: <ca471dc205050608346a2aa4f1@mail.gmail.com>

[Greg Ewing]
> How about user-defined keywords?
> 
> Suppose you could write
> 
>    statement opening
> 
>    def opening(path, mode):
>      f = open(path, mode)
>      try:
>        yield
>      finally:
>        close(f)
> 
> which would then allow
> 
>    opening "myfile", "w" as f:
>      do_something_with(f)
[etc.]

This one is easy to reject outright:

- I have no idea how that would be implemented, especially since you
propose allowing the newly minted keyword to be used as the target of
an import. I'm sure it can be done, but it would be a major departure
from the current parser/lexer separation and would undoubtedly be an
extra headache for Jython and IronPython, which use standard
components for their parsing.

- It doesn't seem to buy you much -- just dropping two parentheses.

- I don't see how it would handle the case where the block-controller
is a method call or something else beyond a simple identifier.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pinard at iro.umontreal.ca  Fri May  6 17:35:16 2005
From: pinard at iro.umontreal.ca (=?iso-8859-1?Q?Fran=E7ois?= Pinard)
Date: Fri, 6 May 2005 11:35:16 -0400
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc205050607513f9c1c9b@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<9768a4652f7ffa1778560b3548609827@xs4all.nl>
	<427A3971.8030400@gmail.com> <427ADEDA.1050205@canterbury.ac.nz>
	<ca471dc205050607513f9c1c9b@mail.gmail.com>
Message-ID: <20050506153516.GA11224@phenix.progiciels-bpi.ca>

[Guido van Rossum]
> [Nick Coghlan]

> > > What does a try statement with neither an except clause nor a
> > > finally clause mean?

> [Greg Ewing]
>
> > I guess it would mean the same as

> >    if 1:
> >      ...

> I strongly disagree with this.  [...]

Allow me a quick comment on this issue.

It happens once in a while that I want to comment out the except clauses
of a try statement, when I want the traceback of the inner raising, for
debugging purposes.  Syntax forces me to also comment out the `try:' line
and dedent the lines following it -- and of course, to do the converse
operation once debugging is done.  This is slightly heavy.

In a few places, Python is helpful for such editorial chores: for
example, it allows a spurious trailing comma at the end of lists, dicts
and tuples, and `pass' is useful as a placeholder for commented-out code.

At least, the new proposed syntax would allow for some:

     finally:
         pass

addendum when commenting except clauses, simplifying the editing job for
the `try:' line and those following.


P.S. - Another detail, while on this subject.  On the first message I've read
on this topic, the original poster wrote something like:

    f = None
    try:
        f = action1(...)
    ...
    finally:
        if f is not None:
            action2(f)

The proposed syntax did not include this little guard against "None",
quoted above, which gives an overly optimistic impression of its
conciseness.  While nice, the syntax still does not solve this detail,
which occurs frequently in my experience.  Oh, I do not have solutions
to offer, but it might be worth a thought from the mighty thinkers of
this list :-)

-- 
François Pinard   http://pinard.progiciels-bpi.ca

From gvanrossum at gmail.com  Fri May  6 17:41:32 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 08:41:32 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <20050506153516.GA11224@phenix.progiciels-bpi.ca>
References: <d59vll$4qf$1@sea.gmane.org>
	<9768a4652f7ffa1778560b3548609827@xs4all.nl>
	<427A3971.8030400@gmail.com> <427ADEDA.1050205@canterbury.ac.nz>
	<ca471dc205050607513f9c1c9b@mail.gmail.com>
	<20050506153516.GA11224@phenix.progiciels-bpi.ca>
Message-ID: <ca471dc2050506084130747b82@mail.gmail.com>

[François Pinard]
> It happens once in a while that I want to comment out the except clauses
> of a try statement, when I want the traceback of the inner raising, for
> debugging purposes.  Syntax forces me to also comment the `try:' line,
> and indent out the lines following the `try:' line.  And of course, the
> converse operation once debugging is done.  This is slightly heavy.

I tend to address this by substituting a different exception. I don't
see the use case common enough to want to allow dangling try-suites.

> P.S. - Another detail, while on this subject.  On the first message I've read
> on this topic, the original poster wrote something like:
> 
>     f = None
>     try:
>         f = action1(...)
>     ...
>     finally:
>         if f is not None:
>             action2(f)
> 
> The proposed syntax did not repeat this little part about "None", quoted
> above, so suggesting an over-good feeling about syntax efficiency.
> While nice, the syntax still does not solve this detail, which occurs
> frequently in my experience.  Oh, I do not have solutions to offer, but
> it might be worth a thought from the mighty thinkers of this list :-)

I don't understand your issue here. What is the problem with that
code? Perhaps it ought to be rewritten as

f = action1()
try:
    ...
finally:
    action2(f)

I can't see how this would ever do something different than your version.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pinard at iro.umontreal.ca  Fri May  6 18:02:33 2005
From: pinard at iro.umontreal.ca (=?iso-8859-1?Q?Fran=E7ois?= Pinard)
Date: Fri, 6 May 2005 12:02:33 -0400
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc2050506084130747b82@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<9768a4652f7ffa1778560b3548609827@xs4all.nl>
	<427A3971.8030400@gmail.com> <427ADEDA.1050205@canterbury.ac.nz>
	<ca471dc205050607513f9c1c9b@mail.gmail.com>
	<20050506153516.GA11224@phenix.progiciels-bpi.ca>
	<ca471dc2050506084130747b82@mail.gmail.com>
Message-ID: <20050506160233.GA13176@phenix.progiciels-bpi.ca>

[Guido van Rossum]

> [François Pinard]
>
> > It happens once in a while that I want to comment out the except
> > clauses of a try statement, when I want the traceback of the inner
> > raising, for debugging purposes.  Syntax forces me to also comment
> > the `try:' line, and indent out the lines following the `try:' line.
> > And of course, the converse operation once debugging is done.  This
> > is slightly heavy.

> I tend to address this by substituting a different exception. I don't
> see the use case common enough to want to allow dangling try-suites.

Quite agreed.  I just wanted to tell there was a need.

> > P.S. - Another detail, while on this subject.  On the first message
> > I've read on this topic, the original poster wrote something like:

> >     f = None
> >     try:
> >         f = action1(...)
> >     ...
> >     finally:
> >         if f is not None:
> >             action2(f)

> > The proposed syntax did not repeat this little part about "None",
> > quoted above, so suggesting an over-good feeling about syntax
> > efficiency.  While nice, the syntax still does not solve this
> > detail, which occurs frequently in my experience.  Oh, I do not have
> > solutions to offer, but it might be worth a thought from the mighty
> > thinkers of this list :-)

> I don't understand your issue here. What is the problem with that
> code? Perhaps it ought to be rewritten as

> f = action1()
> try:
>     ...
> finally:
>     action2(f)

> I can't see how this would ever do something different than your version.

Oh, the problem is that if `action1()' raises an exception (and this is
why it has to be within the `try', not before), `f' will not receive
a value, and so, may not be initialised in all cases.  The (frequent)
stunt is a guard so this never becomes a problem.
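
The guard can be seen in miniature with a toy resource (the names here are
illustrative only, standing in for the original poster's action1/action2):

```python
class Resource:
    closed = False
    def close(self):
        self.closed = True

def acquire(fail):
    # Stand-in for action1(): may raise before producing a resource.
    if fail:
        raise RuntimeError("acquisition failed")
    return Resource()

def use(fail):
    r = None
    try:
        r = acquire(fail)
        # ... work with r ...
    finally:
        if r is not None:   # the guard: acquire() may never have returned
            r.close()
    return r

print(use(False).closed)  # True
```

If acquire() raises, the finally suite still runs, but the guard keeps it
from calling close() on a resource that was never obtained.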

-- 
François Pinard   http://pinard.progiciels-bpi.ca

From reinhold-birkenfeld-nospam at wolke7.net  Fri May  6 18:00:46 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Fri, 06 May 2005 18:00:46 +0200
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc2050506084130747b82@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>	<9768a4652f7ffa1778560b3548609827@xs4all.nl>	<427A3971.8030400@gmail.com>
	<427ADEDA.1050205@canterbury.ac.nz>	<ca471dc205050607513f9c1c9b@mail.gmail.com>	<20050506153516.GA11224@phenix.progiciels-bpi.ca>
	<ca471dc2050506084130747b82@mail.gmail.com>
Message-ID: <d5g40t$hle$1@sea.gmane.org>

Guido van Rossum wrote:
> [François Pinard]
>> It happens once in a while that I want to comment out the except clauses
>> of a try statement, when I want the traceback of the inner raising, for
>> debugging purposes.  Syntax forces me to also comment the `try:' line,
>> and indent out the lines following the `try:' line.  And of course, the
>> converse operation once debugging is done.  This is slightly heavy.
> 
> I tend to address this by substituting a different exception. I don't
> see the use case common enough to want to allow dangling try-suites.

Easy enough, adding "raise" at the top of the except clause also solves the problem.

>> P.S. - Another detail, while on this subject.  On the first message I've read
>> on this topic, the original poster wrote something like:
>> 
>>     f = None
>>     try:
>>         f = action1(...)
>>     ...
>>     finally:
>>         if f is not None:
>>             action2(f)
>> 
>> The proposed syntax did not repeat this little part about "None", quoted
>> above, so suggesting an over-good feeling about syntax efficiency.
>> While nice, the syntax still does not solve this detail, which occurs
>> frequently in my experience.  Oh, I do not have solutions to offer, but
>> it might be worth a thought from the mighty thinkers of this list :-)
> 
> I don't understand your issue here. What is the problem with that
> code? Perhaps it ought to be rewritten as
> 
> f = action1()
> try:
>     ...
> finally:
>     action2(f)
> 
> I can't see how this would ever do something different than your version.

Well, in the original the call to action1 was wrapped in an additional try-except
block.

f = None
try:
    try:
        f = action1()
    except:
        print "error"
finally:
    if f is not None:
        action2(f)


Reinhold


-- 
Mail address is perfectly valid!


From fredrik at pythonware.com  Fri May  6 18:32:30 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Fri, 6 May 2005 18:32:30 +0200
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
References: <d59vll$4qf$1@sea.gmane.org><9768a4652f7ffa1778560b3548609827@xs4all.nl><427A3971.8030400@gmail.com>
	<427ADEDA.1050205@canterbury.ac.nz><ca471dc205050607513f9c1c9b@mail.gmail.com>
	<20050506153516.GA11224@phenix.progiciels-bpi.ca>
Message-ID: <d5g5q5$rd0$1@sea.gmane.org>

François Pinard wrote:

> It happens once in a while that I want to comment out the except clauses
> of a try statement, when I want the traceback of the inner raising, for
> debugging purposes.  Syntax forces me to also comment the `try:' line,
> and indent out the lines following the `try:' line.  And of course, the
> converse operation once debugging is done.  This is slightly heavy.

the standard pydiom for this is to change

    try:
        blabla
    except IOError:
        blabla

to

    try:
        blabla
    except "debug": # IOError:
        blabla

(to save typing, you can use an empty string or even
put quotes around the exception name, but that may
make it harder to spot the change)

</F>




From pje at telecommunity.com  Fri May  6 18:41:00 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Fri, 06 May 2005 12:41:00 -0400
Subject: [Python-Dev] The decorator module
In-Reply-To: <ca471dc205050607553b4bafed@mail.gmail.com>
References: <4edc17eb0505060741635ecde8@mail.gmail.com>
	<fb6fbf560505060730789906e2@mail.gmail.com>
	<4edc17eb0505060741635ecde8@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050506123920.020644e0@mail.telecommunity.com>

At 07:55 AM 5/6/2005 -0700, Guido van Rossum wrote:
>[jJ]
> > > Incidentally, would the resulting functions be a bit faster if you
> > > compiled the lambda instead of repeatedly eval ing it, or does the
> > > eval overhead still apply?
>
>[Michele]
> > Honestly, I don't care, since "eval" happens only once at decoration time.
> > There is no "eval" overhead at calling time, so I do not expect to have
> > problems. I am waiting for volunteers to perform profiling and
> > performance analysis ;)
>
>Watch out. I didn't see the code referred to, but realize that eval is
>*very* expensive on some other implementations of Python (Jython and
>IronPython). Eval should only be used if there is actual user-provided
>input that you don't know yet when your module is compiled; not to get
>around some limitation in the language (there are usually ways around
>that, and occasionally we add one, e.g. getattr()).

In this case, the informally-discussed proposal is to add a mutable 
__signature__ to functions, and have it be used by inspect.getargspec(), so 
that decorators can copy __signature__ from the decoratee to the decorated 
function.
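
For reference, this is roughly how things later turned out in modern
Python: functools.wraps sets __wrapped__ (which inspect.signature()
follows), and a __signature__ attribute can also be set explicitly, much
in the spirit of the proposal here (the decorator name below is just
illustrative):

```python
import functools
import inspect

def passthrough(func):
    @functools.wraps(func)          # sets wrapper.__wrapped__ = func
    def wrapper(*args, **kw):
        return func(*args, **kw)
    # Explicit copy, in the spirit of the __signature__ proposal:
    wrapper.__signature__ = inspect.signature(func)
    return wrapper

@passthrough
def greet(name, punctuation="!"):
    return "Hello, " + name + punctuation

print(inspect.signature(greet))  # (name, punctuation='!')
print(greet("world"))            # Hello, world!
```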


From fredrik at pythonware.com  Fri May  6 18:36:57 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Fri, 6 May 2005 18:36:57 +0200
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
References: <d59vll$4qf$1@sea.gmane.org><9768a4652f7ffa1778560b3548609827@xs4all.nl><427A3971.8030400@gmail.com>
	<427ADEDA.1050205@canterbury.ac.nz><ca471dc205050607513f9c1c9b@mail.gmail.com><20050506153516.GA11224@phenix.progiciels-bpi.ca><ca471dc2050506084130747b82@mail.gmail.com>
	<20050506160233.GA13176@phenix.progiciels-bpi.ca>
Message-ID: <d5g62f$sbq$1@sea.gmane.org>

François Pinard wrote:

> > >     f = None
> > >     try:
> > >         f = action1(...)
> > >     ...
> > >     finally:
> > >         if f is not None:
> > >             action2(f)

> > f = action1()
> > try:
> >     ...
> > finally:
> >     action2(f)

> > I can't see how this would ever do something different than your version.

> Oh, the problem is that if `action1()' raises an exception (and this is
> why it has to be within the `try', not before), `f' will not receive
> a value, and so, may not be initialised in all cases.  The (frequent)
> stunt is a guard so this never becomes a problem.

in Guido's solution, the "finally" clause won't be called at all if action1 raises
an exception.

</F>




From pje at telecommunity.com  Fri May  6 18:58:03 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Fri, 06 May 2005 12:58:03 -0400
Subject: [Python-Dev] PEP 340 - For loop cleanup, and feature  separation
In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE025204EE@au3010avexu1.global.avaya.com>
Message-ID: <5.1.1.6.0.20050506124242.0236c778@mail.telecommunity.com>

At 01:58 PM 5/6/2005 +1000, Delaney, Timothy C (Timothy) wrote:
>Personally, I'm of the opinion that we should make a significant break
>(no pun intended ;) and have for-loops attempt to ensure that iterators
>are exhausted.

This is simply not backward compatible with existing, perfectly valid and 
sensible code.
Therefore, this can't happen till Py3K.

The only way I could see to allow this is if:

1. Calling __iter__ on the target of the for loop returns the same object
2. The for loop owns the only reference to that iterator.

However, #2 is problematic for non-CPython implementations, and in any case 
the whole thing seems terribly fragile.

So how about this: calling __exit__(StopIteration) on a generator that 
doesn't have any active blocks could simply *not* exhaust the 
iterator.  This would ensure that any iterator whose purpose is just 
iteration (i.e. all generators written to date) still behave in a resumable 
fashion.

Ugh.  It's still fragile, though, as adding a block to an iterator will 
then make it behave differently.  It seems likely to provoke subtle errors, 
arguing again in favor of a complete separation between iteration and block 
protocols.
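
The resumability at stake is easy to see with a plain generator: under
current semantics, breaking out of a for loop leaves the iterator
suspended rather than exhausted:

```python
def count_to(n):
    for i in range(n):
        yield i

g = count_to(5)
seen = []
for x in g:
    seen.append(x)
    if x == 1:
        break           # leaves g suspended, not closed

print(next(g))          # 2 -- the generator resumes where it left off
```

Having the for loop exhaust (or close) the iterator on break would
change the behavior of exactly this kind of code.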


From gvanrossum at gmail.com  Fri May  6 19:06:36 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 10:06:36 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <d5g40t$hle$1@sea.gmane.org>
References: <d59vll$4qf$1@sea.gmane.org>
	<9768a4652f7ffa1778560b3548609827@xs4all.nl>
	<427A3971.8030400@gmail.com> <427ADEDA.1050205@canterbury.ac.nz>
	<ca471dc205050607513f9c1c9b@mail.gmail.com>
	<20050506153516.GA11224@phenix.progiciels-bpi.ca>
	<ca471dc2050506084130747b82@mail.gmail.com>
	<d5g40t$hle$1@sea.gmane.org>
Message-ID: <ca471dc205050610066c2ffbe5@mail.gmail.com>

[me]
> > I can't see how this would ever do something different than your version.

[Reinhold]
> Well, in the original the call to action1 was wrapped in an additional try-except
> block.

Ah. Francois was misquoting it.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Fri May  6 19:07:53 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 10:07:53 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <d5g5q5$rd0$1@sea.gmane.org>
References: <d59vll$4qf$1@sea.gmane.org>
	<9768a4652f7ffa1778560b3548609827@xs4all.nl>
	<427A3971.8030400@gmail.com> <427ADEDA.1050205@canterbury.ac.nz>
	<ca471dc205050607513f9c1c9b@mail.gmail.com>
	<20050506153516.GA11224@phenix.progiciels-bpi.ca>
	<d5g5q5$rd0$1@sea.gmane.org>
Message-ID: <ca471dc205050610075b49a824@mail.gmail.com>

[Fredrik]
> the standard pydiom for this is to change
> 
>     try:
>         blabla
>     except IOError:
>         blabla
> 
> to
> 
>     try:
>         blabla
>     except "debug": # IOError:
>         blabla
> 
> (to save typing, you can use an empty string or even
> put quotes around the exception name, but that may
> make it harder to spot the change)

Yeah, but that will stop working in Python 3.0. I like the solution
that puts a bare "raise" at the top of the except clause.
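
The bare-raise idiom sketched (a hypothetical helper for illustration;
the early `raise` re-raises with the original traceback, making the
normal handling below it temporarily unreachable):

```python
import io

def load(stream):
    try:
        return int(stream.read())
    except ValueError:
        raise               # debugging: re-raise, keeping the traceback
        return None         # normal fallback, temporarily unreachable

print(load(io.StringIO("42")))  # 42
```

Deleting the `raise` line restores the except clause's normal behavior,
with no need to re-indent anything.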

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From steven.bethard at gmail.com  Fri May  6 19:40:07 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Fri, 6 May 2005 11:40:07 -0600
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <79990c6b05050601395211a722@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org>
	<79990c6b050504015762d004ac@mail.gmail.com>
	<4278D7D7.2040805@gmail.com>
	<d11dcfba050504083551bb0a1e@mail.gmail.com>
	<4279F909.6030206@gmail.com>
	<d11dcfba050505080585fa710@mail.gmail.com> <427A4422.4@gmail.com>
	<79990c6b05050512582e52a7af@mail.gmail.com>
	<d11dcfba05050514344a8f4517@mail.gmail.com>
	<79990c6b05050601395211a722@mail.gmail.com>
Message-ID: <d11dcfba0505061040d28ed38@mail.gmail.com>

On 5/6/05, Paul Moore <p.f.moore at gmail.com> wrote:
> > I don't think it "damages" any features.  Are there features you still
> > think the non-looping proposal removes?  (I'm not counting orthogonal
> > features like "continue EXPR" which could easily be added as an
> > entirely separate PEP.)
> 
> I *am* specifically referring to these "orthogonal" features. Removal
> of looping by modification of PEP 340 will do no such "damage", I
> agree - but removal by accepting an updated PEP 310, or a new PEP,
> *will* (unless the "entirely separate PEP" you mention is written and
> accepted along with the non-looping PEP - and I don't think that will
> happen).

So, just to make sure, if we had another PEP that contained from PEP 340[1]:
 * Specification: the __next__() Method
 * Specification: the next() Built-in Function
 * Specification: a Change to the 'for' Loop
 * Specification: the Extended 'continue' Statement
 * the yield-expression part of Specification: Generator Exit Handling
would that cover all the pieces you're concerned about?

I'd be willing to break these off into a separate PEP if people think
it's a good idea.  I've seen very few complaints about any of these
pieces of the proposal.  If possible, I'd like to see these things
approved now, so that the discussion could focus more directly on the
block-statement issues.

STeVe

[1] http://www.python.org/peps/pep-0340.html
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From gvanrossum at gmail.com  Fri May  6 19:45:28 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 10:45:28 -0700
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <d11dcfba0505061040d28ed38@mail.gmail.com>
References: <20050503150510.GA13595@onegeek.org> <4278D7D7.2040805@gmail.com>
	<d11dcfba050504083551bb0a1e@mail.gmail.com>
	<4279F909.6030206@gmail.com>
	<d11dcfba050505080585fa710@mail.gmail.com> <427A4422.4@gmail.com>
	<79990c6b05050512582e52a7af@mail.gmail.com>
	<d11dcfba05050514344a8f4517@mail.gmail.com>
	<79990c6b05050601395211a722@mail.gmail.com>
	<d11dcfba0505061040d28ed38@mail.gmail.com>
Message-ID: <ca471dc2050506104520ed25bd@mail.gmail.com>

[Steven Bethard]
> So, just to make sure, if we had another PEP that contained from PEP 340[1]:
>  * Specification: the __next__() Method
>  * Specification: the next() Built-in Function
>  * Specification: a Change to the 'for' Loop
>  * Specification: the Extended 'continue' Statement
>  * the yield-expression part of Specification: Generator Exit Handling
> would that cover all the pieces you're concerned about?
> 
> I'd be willing to break these off into a separate PEP if people think
> it's a good idea.  I've seen very few complaints about any of these
> pieces of the proposal.  If possible, I'd like to see these things
> approved now, so that the discussion could focus more directly on the
> block-statement issues.

I don't think it's necessary to separate this out into a separate PEP;
that just seems busy-work. I agree these parts are orthogonal and
uncontroversial; a counter-PEP can suffice by stating that it's not
countering those items nor repeating them.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From python at rcn.com  Fri May  6 19:53:24 2005
From: python at rcn.com (Raymond Hettinger)
Date: Fri, 6 May 2005 13:53:24 -0400
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <ca471dc2050506104520ed25bd@mail.gmail.com>
Message-ID: <002001c55264$8065f560$11bd2c81@oemcomputer>

> > I'd be willing to break these off into a separate PEP if people think
> > it's a good idea.  I've seen very few complaints about any of these
> > pieces of the proposal.  If possible, I'd like to see these things
> > approved now, so that the discussion could focus more directly on the
> > block-statement issues.
> 
> I don't think it's necessary to separate this out into a separate PEP;
> that just seems busy-work. I agree these parts are orthogonal and
> uncontroversial; a counter-PEP can suffice by stating that it's not
> countering those items nor repeating them.

If someone volunteers to split it out for you, I think it would be
worthwhile.  Right now, the PEP is hard to swallow in one bite.
Improving its digestibility would be a big help when the PEP is offered
up to the tender mercies to comp.lang.python.


Raymond

From rrr at ronadam.com  Fri May  6 19:59:20 2005
From: rrr at ronadam.com (Ron Adam)
Date: Fri, 06 May 2005 13:59:20 -0400
Subject: [Python-Dev] PEP 340 - For loop cleanup, and feature  separation
In-Reply-To: <5.1.1.6.0.20050506124242.0236c778@mail.telecommunity.com>
References: <5.1.1.6.0.20050506124242.0236c778@mail.telecommunity.com>
Message-ID: <427BB078.7000404@ronadam.com>

Phillip J. Eby wrote:
> At 01:58 PM 5/6/2005 +1000, Delaney, Timothy C (Timothy) wrote:
> 
>>Personally, I'm of the opinion that we should make a significant break
>>(no pun intended ;) and have for-loops attempt to ensure that iterators
>>are exhausted.
> 
> 
> This is simply not backward compatible with existing, perfectly valid and 
> sensible code.
> Therefore, this can't happen till Py3K.
> 
> The only way I could see to allow this is if:
> 
> 1. Calling __iter__ on the target of the for loop returns the same object
> 2. The for loop owns the only reference to that iterator.
> 
> However, #2 is problematic for non-CPython implementations, and in any case 
> the whole thing seems terribly fragile.

Is it better to have:
	1. A single looping construct that does everything,
	2. or several more specialized loops that are distinct?

I think the second may be better for performance reasons.  So it may
be better to just add a third loop construct just for iterators.

	a.  for-loop --> iterable sequences and lists only
	b.  while-loop --> bool evaluations only
	c.  do-loop --> iterators only

Choice c. could mimic a. and b. with an iterator when the situation 
requires a for-loop or while-loop with special handling.

Ron_Adam

From gvanrossum at gmail.com  Fri May  6 20:00:13 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 11:00:13 -0700
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <002001c55264$8065f560$11bd2c81@oemcomputer>
References: <ca471dc2050506104520ed25bd@mail.gmail.com>
	<002001c55264$8065f560$11bd2c81@oemcomputer>
Message-ID: <ca471dc205050611006ceb5c00@mail.gmail.com>

[me]
> > I don't think it's necessary to separate this out into a separate PEP;
> > that just seems busy-work. I agree these parts are orthogonal and
> > uncontroversial; a counter-PEP can suffice by stating that it's not
> > countering those items nor repeating them.

[Raymond]
> If someone volunteers to split it out for you, I think it would be
> worthwhile.  Right now, the PEP is hard to swallow in one bite.
> Improving its digestibility would be a big help when the PEP is offered
> up to the tender mercies of comp.lang.python.

Well, I don't care so much about their tender mercies right now. I'm
not even sure that if we reach agreement on python-dev there's any
point in repeating the agony on c.l.py.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From steven.bethard at gmail.com  Fri May  6 20:16:47 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Fri, 6 May 2005 12:16:47 -0600
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <002001c55264$8065f560$11bd2c81@oemcomputer>
References: <ca471dc2050506104520ed25bd@mail.gmail.com>
	<002001c55264$8065f560$11bd2c81@oemcomputer>
Message-ID: <d11dcfba050506111622ea81e3@mail.gmail.com>

[Guido]
> I don't think it's necessary to separate this out into a separate PEP;
> that just seems busy-work. I agree these parts are orthogonal and
> uncontroversial; a counter-PEP can suffice by stating that it's not
> countering those items nor repeating them.

[Raymond]
> If someone volunteers to split it out for you, I think it would be
> worthwhile.  Right now, the PEP is hard to swallow in one bite.
> Improving its digestibility would be a big help when the PEP is offered
> up to the tender mercies of comp.lang.python.

Well, busy-work or not, I took the 20 minutes to split them up, so I
figured I might as well make them available.  It was actually really
easy to split them apart, and I think they both read better this way,
but I'm not sure my opinion counts for much here anyway. ;-)  (The
Enhanced Iterators PEP is first, the remainder of PEP 340 follows it.)

----------------------------------------------------------------------
PEP: XXX
Title: Enhanced Iterators
Version: 
Last-Modified: 
Author: Guido van Rossum
Status: Draft
Type: Standards Track
Content-Type: text/plain
Created: 6-May-2005
Post-History:

Introduction

    This PEP proposes a new iterator API that allows values to be
    passed into an iterator using "continue EXPR". These values are
    received in the iterator as an argument to the new __next__
    method, and can be accessed in a generator with a
    yield-expression.
    
    The content of this PEP is derived from the original content of
    PEP 340, broken off into its own PEP as the new iterator API is
    pretty much orthogonal from the anonymous block statement
    discussion.

Motivation and Summary

    ...

Use Cases

    See the Examples section near the end.

Specification: the __next__() Method

    A new method for iterators is proposed, called __next__().  It
    takes one optional argument, which defaults to None.  Calling the
    __next__() method without argument or with None is equivalent to
    using the old iterator API, next().  For backwards compatibility,
    it is recommended that iterators also implement a next() method as
    an alias for calling the __next__() method without an argument.

    The argument to the __next__() method may be used by the iterator
    as a hint on what to do next.

Specification: the next() Built-in Function

    This is a built-in function defined as follows:

        def next(itr, arg=None):
            nxt = getattr(itr, "__next__", None)
            if nxt is not None:
                return nxt(arg)
            if arg is None:
                return itr.next()
            raise TypeError("next() with arg for old-style iterator")

    This function is proposed because there is often a need to call
    the next() method outside a for-loop; the new API, and the
    backwards compatibility code, is too ugly to have to repeat in
    user code.
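    As a rough sketch of how the proposed built-in would behave in
    today's Python, here is a rendition with the helper renamed to
    next_hint (so it doesn't shadow the built-in next()) and a toy
    Counter iterator; both names are invented purely for illustration:

```python
# Hypothetical demo of the proposed next(itr, arg) built-in.
# 'next_hint' and 'Counter' are names invented for this sketch.

class Counter:
    """Iterator whose __next__ accepts the proposed optional hint:
    a non-None hint resets the counter to that value first."""
    def __init__(self):
        self.i = 0

    def __next__(self, arg=None):
        if arg is not None:
            self.i = arg
        self.i += 1
        return self.i

def next_hint(itr, arg=None):
    # Pass the hint through if the iterator supports the new API;
    # otherwise fall back to the old argument-less protocol.
    nxt = getattr(itr, "__next__", None)
    if nxt is not None:
        return nxt(arg)
    if arg is None:
        return itr.next()
    raise TypeError("next() with arg for old-style iterator")

c = Counter()
assert next_hint(c) == 1
assert next_hint(c) == 2
assert next_hint(c, 10) == 11   # the hint resets the count first
```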

Specification: a Change to the 'for' Loop

    A small change in the translation of the for-loop is proposed.
    The statement

        for VAR1 in EXPR1:
            BLOCK1
        else:
            BLOCK2

    will be translated as follows:

        itr = iter(EXPR1)
        arg = None    # Set by "continue EXPR2", see below
        brk = False
        while True:
            try:
                VAR1 = next(itr, arg)
            except StopIteration:
                brk = True
                break
            arg = None
            BLOCK1
        if brk:
            BLOCK2

    (However, the variables 'itr' etc. are not user-visible and the
    built-in names used cannot be overridden by the user.)

Specification: the Extended 'continue' Statement

    In the translation of the for-loop, inside BLOCK1, the new syntax

        continue EXPR2

    is legal and is translated into

        arg = EXPR2
        continue

    (Where 'arg' references the corresponding hidden variable from the
    previous section.)

    This is also the case in the body of the block-statement proposed
    below.

    EXPR2 may contain commas; "continue 1, 2, 3" is equivalent to
    "continue (1, 2, 3)".

Specification: Generators and Yield-Expressions

    Generators will implement the new __next__() method API, as well
    as the old argument-less next() method which becomes an alias for
    calling __next__() without an argument.

    The yield-statement will be allowed to be used on the right-hand
    side of an assignment; in that case it is referred to as
    yield-expression.  The value of this yield-expression is None
    unless __next__() was called with an argument; see below.

    A yield-expression must always be parenthesized except when it
    occurs at the top-level expression on the right-hand side of an
    assignment.  So

        x = yield 42
        x = yield
        x = 12 + (yield 42)
        x = 12 + (yield)
        foo(yield 42)
        foo(yield)

    are all legal, but

        x = 12 + yield 42
        x = 12 + yield
        foo(yield 42, 12)
        foo(yield, 12)

    are all illegal.  (Some of the edge cases are motivated by the
    current legality of "yield 12, 42".)

    When __next__() is called with an argument that is not None, the
    yield-expression that it resumes will return the argument.  If it
    resumes a yield-statement, the value is ignored (this is similar
    to ignoring the value returned by a function call).  When the
    *initial* call to __next__() receives an argument that is not
    None, TypeError is raised; this is likely caused by some logic
    error.  When __next__() is called without an argument or with None
    as argument, and a yield-expression is resumed, the
    yield-expression returns None.

    Note: the syntactic extensions to yield make its use very similar
    to that in Ruby.  This is intentional.  Do note that in Python the
    block passes a value to the generator using "continue EXPR" rather
    than "return EXPR", and the underlying mechanism whereby control
    is passed between the generator and the block is completely
    different.  Blocks in Python are not compiled into thunks; rather,
    yield suspends execution of the generator's frame.  Some edge
    cases work differently; in Python, you cannot save the block for
    later use, and you cannot test whether there is a block or not.
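    For comparison, the semantics Python eventually shipped (PEP 342)
    are very close to this proposal, spelled gen.send(value) instead
    of __next__(value) -- including the rule that the *initial* resume
    must not carry a value.  A running-average generator shows the
    value-passing:

```python
# Value-passing into a generator as released Python does it (PEP 342).
def averager():
    total = count = 0
    while True:
        # The yield-expression evaluates to whatever was sent in.
        term = yield (total / count if count else None)
        total += term
        count += 1

g = averager()
assert g.send(None) is None    # prime the generator (like a first next())
assert g.send(10) == 10.0
assert g.send(20) == 15.0
```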

Acknowledgements

    See Acknowledgements of PEP 340.

References

    ...

Copyright

    This document has been placed in the public domain.

----------------------------------------------------------------------
**********************************************************************
----------------------------------------------------------------------

PEP: 340
Title: Anonymous Block Statements
Version: $Revision: 1.24 $
Last-Modified: $Date: 2005/05/05 15:39:19 $
Author: Guido van Rossum
Status: Draft
Type: Standards Track
Content-Type: text/plain
Created: 27-Apr-2005
Post-History:

Introduction

    This PEP proposes a new type of compound statement which can be
    used for resource management purposes.  The new statement type
    is provisionally called the block-statement because the keyword
    to be used has not yet been chosen.

    This PEP competes with several other PEPs: PEP 288 (Generators
    Attributes and Exceptions; only the second part), PEP 310
    (Reliable Acquisition/Release Pairs), and PEP 325
    (Resource-Release Support for Generators).

    I should clarify that using a generator to "drive" a block
    statement is a separable proposal; with just the definition of
    the block statement from the PEP you could implement all the
    examples using a class (similar to example 6, which is easily
    turned into a template).

    But the key idea is using a generator to drive a block statement;
    the rest is elaboration.

Motivation and Summary

    (Thanks to Shane Hathaway -- Hi Shane!)

    Good programmers move commonly used code into reusable functions.
    Sometimes, however, patterns arise in the structure of the
    functions rather than the actual sequence of statements.  For
    example, many functions acquire a lock, execute some code specific
    to that function, and unconditionally release the lock.  Repeating
    the locking code in every function that uses it is error prone and
    makes refactoring difficult.

    Block statements provide a mechanism for encapsulating patterns of
    structure.  Code inside the block statement runs under the control
    of an object called a block iterator.  Simple block iterators
    execute code before and after the code inside the block statement.
    Block iterators also have the opportunity to execute the
    controlled code more than once (or not at all), catch exceptions,
    or receive data from the body of the block statement.

    A convenient way to write block iterators is to write a generator
    (PEP 255).  A generator looks a lot like a Python function, but
    instead of returning a value immediately, generators pause their
    execution at "yield" statements.  When a generator is used as a
    block iterator, the yield statement tells the Python interpreter
    to suspend the block iterator, execute the block statement body,
    and resume the block iterator when the body has executed.

    The Python interpreter behaves as follows when it encounters a
    block statement based on a generator.  First, the interpreter
    instantiates the generator and begins executing it.  The generator
    does setup work appropriate to the pattern it encapsulates, such
    as acquiring a lock, opening a file, starting a database
    transaction, or starting a loop.  Then the generator yields
    execution to the body of the block statement using a yield
    statement.  When the block statement body completes, raises an
    uncaught exception, or sends data back to the generator using a
    continue statement, the generator resumes.  At this point, the
    generator can either clean up and stop or yield again, causing the
    block statement body to execute again.  When the generator
    finishes, the interpreter leaves the block statement.

Use Cases

    See the Examples section near the end.

Specification: the __exit__() Method

    An optional new method for iterators is proposed, called
    __exit__().  It takes up to three arguments which correspond to
    the three "arguments" to the raise-statement: type, value, and
    traceback.  If all three arguments are None, sys.exc_info() may be
    consulted to provide suitable default values.
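    A minimal sketch of that lookup rule, with a toy Guard class
    invented for illustration (not a real implementation of the
    proposal):

```python
import sys

class Guard:
    """Toy object implementing the proposed optional __exit__():
    up to three raise-style arguments; all-None falls back to
    sys.exc_info().  The bookkeeping list is ours."""
    def __init__(self):
        self.exits = []

    def __exit__(self, typ=None, value=None, traceback=None):
        if typ is None and value is None and traceback is None:
            # All three None: consult sys.exc_info() per the spec.
            typ, value, traceback = sys.exc_info()
        self.exits.append(typ)

g = Guard()
g.__exit__(StopIteration, StopIteration(), None)
try:
    raise ValueError("boom")
except ValueError:
    g.__exit__()      # picks up the active exception from sys.exc_info()
assert g.exits == [StopIteration, ValueError]
```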

Specification: the Anonymous Block Statement

    A new statement is proposed with the syntax

        block EXPR1 as VAR1:
            BLOCK1

    Here, 'block' and 'as' are new keywords; EXPR1 is an arbitrary
    expression (but not an expression-list) and VAR1 is an arbitrary
    assignment target (which may be a comma-separated list).

    The "as VAR1" part is optional; if omitted, the assignments to
    VAR1 in the translation below are omitted (but the expressions
    assigned are still evaluated!).

    The choice of the 'block' keyword is contentious; many
    alternatives have been proposed, including not to use a keyword at
    all (which I actually like).  PEP 310 uses 'with' for similar
    semantics, but I would like to reserve that for a with-statement
    similar to the one found in Pascal and VB.  (Though I just found
    that the C# designers don't like 'with' [2], and I have to agree
    with their reasoning.)  To sidestep this issue momentarily I'm
    using 'block' until we can agree on the right keyword, if any.

    Note that the 'as' keyword is not contentious (it will finally be
    elevated to proper keyword status).

    Note that it is up to the iterator to decide whether a
    block-statement represents a loop with multiple iterations; in the
    most common use case BLOCK1 is executed exactly once.  To the
    parser, however, it is always a loop; break and continue transfer
    control to the block's iterator (see below for details).

    The translation is subtly different from a for-loop: iter() is
    not called, so EXPR1 should already be an iterator (not just an
    iterable); and the iterator is guaranteed to be notified when
    the block-statement is left, regardless if this is due to a
    break, return or exception:

        itr = EXPR1  # The iterator
        ret = False  # True if a return statement is active
        val = None   # Return value, if ret == True
        exc = None   # sys.exc_info() tuple if an exception is active
        while True:
            try:
                if exc:
                    ext = getattr(itr, "__exit__", None)
                    if ext is not None:
                        VAR1 = ext(*exc)   # May re-raise *exc
                    else:
                        raise exc[0], exc[1], exc[2]
                else:
                    VAR1 = itr.next()  # May raise StopIteration
            except StopIteration:
                if ret:
                    return val
                break
            try:
                ret = False
                val = exc = None
                BLOCK1
            except:
                exc = sys.exc_info()

    (Again, the variables and built-ins are hidden from the user.)
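    To make the expansion concrete, here is a runnable emulation in
    today's Python with the body supplied as a callable (run_block is
    our name; because the body is an ordinary function, the special
    break/return translations below are not modeled -- only normal
    completion and exceptions):

```python
import sys

def run_block(itr, body):
    """Sketch of the expansion above; 'body' stands in for BLOCK1
    and receives VAR1 as its argument."""
    exc = None
    while True:
        try:
            if exc:
                ext = getattr(itr, "__exit__", None)
                if ext is not None:
                    var = ext(*exc)      # may re-raise *exc
                else:
                    raise exc[1]
            else:
                var = next(itr)          # may raise StopIteration
        except StopIteration:
            break
        try:
            exc = None
            body(var)
        except BaseException:
            exc = sys.exc_info()

# Drive it with a generator equivalent to the locking template below.
log = []
def locking():
    log.append("acquire")
    try:
        yield
    finally:
        log.append("release")

run_block(locking(), lambda _: log.append("body"))
assert log == ["acquire", "body", "release"]
```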

    Inside BLOCK1, the following special translations apply:

    - "break" is always legal; it is translated into:

        exc = (StopIteration, None, None)
        continue

    - "return EXPR3" is only legal when the block-statement is
      contained in a function definition; it is translated into:

        exc = (StopIteration, None, None)
        ret = True
        val = EXPR3
        continue

    The net effect is that break and return behave much the same as
    if the block-statement were a for-loop, except that the iterator
    gets a chance at resource cleanup before the block-statement is
    left, through the optional __exit__() method. The iterator also
    gets a chance if the block-statement is left through raising an
    exception.  If the iterator doesn't have an __exit__() method,
    there is no difference with a for-loop (except that a for-loop
    calls iter() on EXPR1).

    Note that a yield-statement in a block-statement is not treated
    differently.  It suspends the function containing the block
    *without* notifying the block's iterator.  The block's iterator
    is entirely unaware of this yield, since the local control flow
    doesn't actually leave the block. In other words, it is *not*
    like a break or return statement.  When the loop that was
    resumed by the yield calls next(), the block is resumed right
    after the yield.  The generator finalization semantics described
    below guarantee (within the limitations of all finalization
    semantics) that the block will be resumed eventually.

    Unlike the for-loop, the block-statement does not have an
    else-clause.  I think it would be confusing, and emphasize the
    "loopiness" of the block-statement, while I want to emphasize its
    *difference* from a for-loop.  In addition, there are several
    possible semantics for an else-clause, and only a very weak use
    case.

Specification: Generator Exit Handling

    Generators will implement the new __exit__() method API.

    Generators will be allowed to have a yield statement inside a
    try-finally statement.

    The expression argument to the yield-statement will become
    optional (defaulting to None).

    When __exit__() is called, the generator is resumed but at the
    point of the yield-statement the exception represented by the
    __exit__ argument(s) is raised.  The generator may re-raise this
    exception, raise another exception, or yield another value,
    except that if the exception passed in to __exit__() was
    StopIteration, it ought to raise StopIteration (otherwise the
    effect would be that a break is turned into continue, which is
    unexpected at least).  When the *initial* call resuming the
    generator is an __exit__() call instead of a next() call, the
    generator's execution is aborted and the exception is re-raised
    without passing control to the generator's body.

    When a generator that has not yet terminated is garbage-collected
    (either through reference counting or by the cyclical garbage
    collector), its __exit__() method is called once with
    StopIteration as its first argument.  Together with the
    requirement that a generator ought to raise StopIteration when
    __exit__() is called with StopIteration, this guarantees the
    eventual activation of any finally-clauses that were active when
    the generator was last suspended.  Of course, under certain
    circumstances the generator may never be garbage-collected.  This
    is no different than the guarantees that are made about finalizers
    (__del__() methods) of other objects.
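    Released Python ended up implementing this finalization guarantee
    (via PEP 342) with a GeneratorExit exception and a close() method,
    which garbage collection invokes, rather than __exit__ with
    StopIteration; the effect on finally-clauses is the same:

```python
# Generator finalization as it actually shipped: close() raises
# GeneratorExit at the suspended yield, running finally-clauses.
events = []
def resource():
    events.append("open")
    try:
        yield
    finally:
        events.append("close")

g = resource()
next(g)       # run up to the yield
g.close()     # raises GeneratorExit inside the generator
assert events == ["open", "close"]
```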

Alternatives Considered and Rejected

    - Many alternatives have been proposed for 'block'.  I haven't
      seen a proposal for another keyword that I like better than
      'block' yet.  Alas, 'block' is also not a good choice; it is a
      rather popular name for variables, arguments and methods.
      Perhaps 'with' is the best choice after all?

    - Instead of trying to pick the ideal keyword, the block-statement
      could simply have the form:

        EXPR1 as VAR1:
            BLOCK1

      This is at first attractive because, together with a good choice
      of function names (like those in the Examples section below)
      used in EXPR1, it reads well, and feels like a "user-defined
      statement".  And yet, it makes me (and many others)
      uncomfortable; without a keyword the syntax is very "bland",
      difficult to look up in a manual (remember that 'as' is
      optional), and it makes the meaning of break and continue in the
      block-statement even more confusing.

    - Phillip Eby has proposed to have the block-statement use
      an entirely different API than the for-loop, to differentiate
      between the two.  A generator would have to be wrapped in a
      decorator to make it support the block API.  IMO this adds more
      complexity with very little benefit; and we can't really deny
      that the block-statement is conceptually a loop -- it supports
      break and continue, after all.

    - This keeps getting proposed: "block VAR1 = EXPR1" instead of
      "block EXPR1 as VAR1".  That would be very misleading, since
      VAR1 does *not* get assigned the value of EXPR1; EXPR1 results
      in a generator which is assigned to an internal variable, and
      VAR1 is the value returned by successive calls to the __next__()
      method of that iterator.

    - Why not change the translation to apply iter(EXPR1)?  All the
      examples would continue to work.  But this makes the
      block-statement *more* like a for-loop, while the emphasis ought
      to be on the *difference* between the two.  Not calling iter()
      catches a bunch of misunderstandings, like using a sequence as
      EXPR1.

Comparison to Thunks

    Alternative semantics proposed for the block-statement turn the
    block into a thunk (an anonymous function that blends into the
    containing scope).

    The main advantage of thunks that I can see is that you can save
    the thunk for later, like a callback for a button widget (the
    thunk then becomes a closure).  You can't use a yield-based block
    for that (except in Ruby, which uses yield syntax with a
    thunk-based implementation).  But I have to say that I almost see
    this as an advantage: I think I'd be slightly uncomfortable seeing
    a block and not knowing whether it will be executed in the normal
    control flow or later.  Defining an explicit nested function for
    that purpose doesn't have this problem for me, because I already
    know that the 'def' keyword means its body is executed later.

    The other problem with thunks is that once we think of them as the
    anonymous functions they are, we're pretty much forced to say that
    a return statement in a thunk returns from the thunk rather than
    from the containing function.  Doing it any other way would cause
    major weirdness when the thunk were to survive its containing
    function as a closure (perhaps continuations would help, but I'm
    not about to go there :-).

    But then an IMO important use case for the resource cleanup
    template pattern is lost.  I routinely write code like this:

       def findSomething(self, key, default=None):
           self.lock.acquire()
           try:
                for item in self.elements:
                    if item.matches(key):
                        return item
                return default
           finally:
              self.lock.release()

    and I'd be bummed if I couldn't write this as:

       def findSomething(self, key, default=None):
           block locking(self.lock):
                for item in self.elements:
                    if item.matches(key):
                        return item
                return default

    This particular example can be rewritten using a break:

       def findSomething(self, key, default=None):
           block locking(self.lock):
                for item in self.elements:
                    if item.matches(key):
                        break
                else:
                    item = default
           return item

    but it looks forced and the transformation isn't always that easy;
    you'd be forced to rewrite your code in a single-return style
    which feels too restrictive.

    Also note the semantic conundrum of a yield in a thunk -- the only
    reasonable interpretation is that this turns the thunk into a
    generator!

    Greg Ewing believes that thunks "would be a lot simpler, doing
    just what is required without any jiggery pokery with exceptions
    and break/continue/return statements.  It would be easy to explain
    what it does and why it's useful."

    But in order to obtain the required local variable sharing between
    the thunk and the containing function, every local variable used
    or set in the thunk would have to become a 'cell' (our mechanism
    for sharing variables between nested scopes).  Cells slow down
    access compared to regular local variables: access involves an
    extra C function call (PyCell_Get() or PyCell_Set()).
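    The cell mechanism is visible in today's Python: a nested function
    that rebinds an enclosing variable shares it through a closure cell
    (using the nonlocal keyword that Python later gained, so this is a
    modern illustration, not something available at the time):

```python
# Closure cells in action: 'n' is shared between counter() and bump().
def counter():
    n = 0
    def bump():
        nonlocal n        # rebinding forces 'n' into a cell
        n += 1
    bump(); bump()
    # bump carries a non-empty __closure__: the cell holding n.
    return n, bump.__closure__ is not None

value, has_cell = counter()
assert value == 2
assert has_cell
```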

    Perhaps not entirely coincidentally, the last example above
    (findSomething() rewritten to avoid a return inside the block)
    shows that, unlike for regular nested functions, we'll want
    variables *assigned to* by the thunk also to be shared with the
    containing function, even if they are not assigned to outside the
    thunk.

    Greg Ewing again: "generators have turned out to be more powerful,
    because you can have more than one of them on the go at once. Is
    there a use for that capability here?"

    I believe there are definitely uses for this; several people have
    already shown how to do asynchronous light-weight threads using
    generators (e.g. David Mertz quoted in PEP 288, and Fredrik
    Lundh[3]).

    And finally, Greg says: "a thunk implementation has the potential
    to easily handle multiple block arguments, if a suitable syntax
    could ever be devised. It's hard to see how that could be done in
    a general way with the generator implementation."

    However, the use cases for multiple blocks seem elusive.

    (Proposals have since been made to change the implementation of
    thunks to remove most of these objections, but the resulting
    semantics are fairly complex to explain and to implement, so IMO
    that defeats the purpose of using thunks in the first place.)

Examples

    1. A template for ensuring that a lock, acquired at the start of a
       block, is released when the block is left:

        def locking(lock):
            lock.acquire()
            try:
                yield
            finally:
                lock.release()

       Used as follows:

        block locking(myLock):
            # Code here executes with myLock held.  The lock is
            # guaranteed to be released when the block is left (even
            # if via return or by an uncaught exception).
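       For comparison, the form this template eventually took in
       released Python (PEP 343): the identical generator body, wrapped
       in contextlib.contextmanager and driven by the with-statement:

```python
# The same locking template with today's with-statement (PEP 343).
from contextlib import contextmanager
import threading

@contextmanager
def locking(lock):
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

my_lock = threading.Lock()
with locking(my_lock):
    assert my_lock.locked()       # held inside the block
assert not my_lock.locked()       # released on exit, no matter what
```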

    2. A template for opening a file that ensures the file is closed
       when the block is left:

        def opening(filename, mode="r"):
            f = open(filename, mode)
            try:
                yield f
            finally:
                f.close()

       Used as follows:

        block opening("/etc/passwd") as f:
            for line in f:
                print line.rstrip()
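       In released Python this template became unnecessary: file
       objects are their own context managers.  A sketch using a
       temporary file (since /etc/passwd is system-specific):

```python
# Files close themselves when used with the with-statement.
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "w") as f:
    f.write("root:x:0:0\n")

with open(path) as f:
    lines = [line.rstrip() for line in f]
assert f.closed                    # closed as soon as the block is left
assert lines == ["root:x:0:0"]
os.remove(path)
```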

    3. A template for committing or rolling back a database
       transaction:

        def transactional(db):
            try:
                yield
            except:
                db.rollback()
                raise
            else:
                db.commit()
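       Since this template yields exactly once, it maps directly onto
       the contextlib.contextmanager decorator that later shipped; the
       FakeDB class here is invented for illustration:

```python
# The transactional template with today's with-statement; exceptions
# raised in the body are thrown into the generator at the yield.
from contextlib import contextmanager

@contextmanager
def transactional(db):
    try:
        yield
    except BaseException:
        db.rollback()
        raise
    else:
        db.commit()

class FakeDB:
    def __init__(self):
        self.log = []
    def commit(self):
        self.log.append("commit")
    def rollback(self):
        self.log.append("rollback")

db = FakeDB()
with transactional(db):
    pass
assert db.log == ["commit"]

db2 = FakeDB()
try:
    with transactional(db2):
        raise ValueError("oops")
except ValueError:
    pass
assert db2.log == ["rollback"]
```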

    4. A template that tries something up to n times:

        def auto_retry(n=3, exc=Exception):
            for i in range(n):
                try:
                    yield
                    return
                except exc, err:
                    # perhaps log exception here
                    continue
            raise # re-raise the exception we caught earlier

       Used as follows:

        block auto_retry(3, IOError):
            f = urllib.urlopen("http://python.org/peps/pep-0340.html")
            print f.read()
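       This template yields more than once, which the with-statement
       Python later adopted cannot express; as ordinary code the same
       pattern is a small helper (the names here are ours, and the
       body becomes a callable):

```python
# A plain-function rendition of the auto_retry pattern: call 'call'
# up to n times, re-raising the caught exception on the final failure.
def auto_retry(call, n=3, exc=Exception):
    for i in range(n):
        try:
            return call()
        except exc:
            if i == n - 1:
                raise            # out of attempts: propagate

attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise IOError("transient")
    return "ok"

assert auto_retry(flaky, 3, IOError) == "ok"
assert len(attempts) == 3        # failed twice, succeeded on the third
```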

    5. It is possible to nest blocks and combine templates:

        def locking_opening(lock, filename, mode="r"):
            block locking(lock):
                block opening(filename) as f:
                    yield f

       Used as follows:

        block locking_opening(myLock, "/etc/passwd") as f:
            for line in f:
                print line.rstrip()

       (If this example confuses you, consider that it is equivalent
       to using a for-loop with a yield in its body in a regular
       generator which is invoking another iterator or generator
       recursively; see for example the source code for os.walk().)

    6. It is possible to write a regular iterator with the
       semantics of example 1:

        class locking:
           def __init__(self, lock):
               self.lock = lock
               self.state = 0
           def __next__(self, arg=None):
               # ignores arg
               if self.state:
                   assert self.state == 1
                   self.lock.release()
                   self.state += 1
                   raise StopIteration
               else:
                   self.lock.acquire()
                   self.state += 1
                   return None
           def __exit__(self, type, value=None, traceback=None):
               assert self.state in (0, 1, 2)
               if self.state == 1:
                   self.lock.release()
               raise type, value, traceback

       (This example is easily modified to implement the other
       examples; it shows how much simpler generators are for the same
       purpose.)

    7. Redirect stdout temporarily:

        def redirecting_stdout(new_stdout):
            save_stdout = sys.stdout
            try:
                sys.stdout = new_stdout
                yield
            finally:
                sys.stdout = save_stdout

       Used as follows:

        block opening(filename, "w") as f:
            block redirecting_stdout(f):
                print "Hello world"

    8. A variant on opening() that also returns an error condition:

        def opening_w_error(filename, mode="r"):
            try:
                f = open(filename, mode)
            except IOError, err:
                yield None, err
            else:
                try:
                    yield f, None
                finally:
                    f.close()

       Used as follows:

        block opening_w_error("/etc/passwd", "a") as f, err:
            if err:
                print "IOError:", err
            else:
                f.write("guido::0:0::/:/bin/sh\n")

Acknowledgements

    In no useful order: Alex Martelli, Barry Warsaw, Bob Ippolito,
    Brett Cannon, Brian Sabbey, Chris Ryland, Doug Landauer, Duncan
    Booth, Fredrik Lundh, Greg Ewing, Holger Krekel, Jason Diamond,
    Jim Jewett, Josiah Carlson, Ka-Ping Yee, Michael Chermside,
    Michael Hudson, Neil Schemenauer, Nick Coghlan, Paul Moore,
    Phillip Eby, Raymond Hettinger, Reinhold Birkenfeld, Samuele
    Pedroni, Shannon Behrens, Skip Montanaro, Steven Bethard, Terry
    Reedy, Tim Delaney, Aahz, and others.  Thanks all for the valuable
    contributions!

References

    [1] http://mail.python.org/pipermail/python-dev/2005-April/052821.html

    [2] http://msdn.microsoft.com/vcsharp/programming/language/ask/withstatement/

    [3] http://effbot.org/zone/asyncore-generators.htm


Copyright

    This document has been placed in the public domain.

From nbastin at opnet.com  Fri May  6 20:49:00 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Fri, 6 May 2005 14:49:00 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427B1A06.4010004@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427B1A06.4010004@egenix.com>
Message-ID: <9772ff3ac8afbd8d4451968a065e281b@opnet.com>


On May 6, 2005, at 3:17 AM, M.-A. Lemburg wrote:

> You've got that wrong: Python lets you choose UCS-4 -
> UCS-2 is the default.
>
> Note that Python's Unicode codecs UTF-8 and UTF-16
> are surrogate aware and thus support non-BMP code points
> regardless of the build type: A UCS2-build of Python will
> store a non-BMP code point as UTF-16 surrogate pair in the
> Py_UNICODE buffer while a UCS4 build will store it as a
> single value. Decoding is surrogate aware too, so a UTF-16
> surrogate pair in a UCS2 build will get treated as single
> Unicode code point.
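
The storage behaviour MAL describes for a UCS-2 build can be made concrete. Here is a small sketch (not CPython's implementation, just the standard UTF-16 arithmetic) of how a non-BMP code point maps to the surrogate pair a narrow build would hold in its Py_UNICODE buffer:

```python
def to_surrogate_pair(cp):
    """Split a non-BMP code point (> 0xFFFF) into a UTF-16 surrogate pair."""
    assert cp > 0xFFFF
    v = cp - 0x10000
    high = 0xD800 | (v >> 10)   # high (lead) surrogate
    low = 0xDC00 | (v & 0x3FF)  # low (trail) surrogate
    return high, low

# U+1D11E (MUSICAL SYMBOL G CLEF) lies outside the BMP
print([hex(u) for u in to_surrogate_pair(0x1D11E)])  # ['0xd834', '0xdd1e']
```

Decoding is the same arithmetic in reverse, which is why a surrogate-aware codec can round-trip the code point on either build.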

If this is the case, then we're clearly misleading users.  If the 
configure script says UCS-2, then as a user I would assume that 
surrogate pairs would *not* be encoded, because I chose UCS-2, and it 
doesn't support that.  I would assume that any UTF-16 string I would 
read would be transcoded into the internal type (UCS-2), and 
information would be lost.  If this is not the case, then what does the 
configure option mean?

--
Nick


From nbastin at opnet.com  Fri May  6 20:51:12 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Fri, 6 May 2005 14:51:12 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427B1BD6.1060206@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<ed6dac3aa136985e60159713ff1d75ac@opnet.com>
	<427B1BD6.1060206@egenix.com>
Message-ID: <d6016a11df6560db06fc5184e6a873bc@opnet.com>


On May 6, 2005, at 3:25 AM, M.-A. Lemburg wrote:

> I don't see why you shouldn't use Py_UNICODE buffer directly.
> After all, the reason why we have that typedef is to make it
> possible to program against an abstract type - regardless of
> its size on the given platform.

Because the encoding of that buffer appears to be different depending 
on the configure options.  If that isn't true, then someone needs to 
change the doc, and the configure options.  Right now, it seems *very* 
clear that Py_UNICODE may either be UCS-2 or UCS-4 encoded if you read 
the configure help, and you can't use the buffer directly if the 
encoding is variable.  However, you seem to be saying that this isn't 
true.

--
Nick


From nbastin at opnet.com  Fri May  6 21:37:57 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Fri, 6 May 2005 15:37:57 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427B1A06.4010004@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427B1A06.4010004@egenix.com>
Message-ID: <1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>


On May 6, 2005, at 3:17 AM, M.-A. Lemburg wrote:

> You've got that wrong: Python lets you choose UCS-4 -
> UCS-2 is the default.

No, that's not true.  Python lets you choose UCS-4 or UCS-2.  What the 
default is depends on your platform.  If you run raw configure, some 
systems will choose UCS-4, and some will choose UCS-2.  This is how the 
conversation came about in the first place - running ./configure on 
RHL9 gives you UCS-4.

--
Nick


From pinard at iro.umontreal.ca  Fri May  6 21:39:10 2005
From: pinard at iro.umontreal.ca (=?iso-8859-1?Q?Fran=E7ois?= Pinard)
Date: Fri, 6 May 2005 15:39:10 -0400
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc205050610075b49a824@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<9768a4652f7ffa1778560b3548609827@xs4all.nl>
	<427A3971.8030400@gmail.com> <427ADEDA.1050205@canterbury.ac.nz>
	<ca471dc205050607513f9c1c9b@mail.gmail.com>
	<20050506153516.GA11224@phenix.progiciels-bpi.ca>
	<d5g5q5$rd0$1@sea.gmane.org>
	<ca471dc205050610075b49a824@mail.gmail.com>
Message-ID: <20050506193910.GA14793@phenix.progiciels-bpi.ca>

[Guido van Rossum]

> I like the solution that puts a bare "raise" at the top of the except
> clause.

Yes.  Clean and simple enough.  Thanks all! :-)

-- 
François Pinard   http://pinard.progiciels-bpi.ca

From foom at fuhm.net  Fri May  6 21:42:16 2005
From: foom at fuhm.net (James Y Knight)
Date: Fri, 6 May 2005 15:42:16 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <9772ff3ac8afbd8d4451968a065e281b@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427B1A06.4010004@egenix.com>
	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>
Message-ID: <49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>

On May 6, 2005, at 2:49 PM, Nicholas Bastin wrote:
> If this is the case, then we're clearly misleading users.  If the
> configure script says UCS-2, then as a user I would assume that
> surrogate pairs would *not* be encoded, because I chose UCS-2, and it
> doesn't support that.  I would assume that any UTF-16 string I would
> read would be transcoded into the internal type (UCS-2), and
> information would be lost.  If this is not the case, then what does the
> configure option mean?

It means all the string operations treat strings as if they were UCS-2, 
but that in actuality, they are UTF-16. Same as the case in the windows 
APIs and Java. That is, all string operations are essentially broken, 
because they're operating on encoded bytes, not characters, but claim 
to be operating on characters.
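
The mismatch James describes is the difference between counting code points and counting UTF-16 code units. A hedged sketch (in today's Python, where str is no longer build-dependent, so the UTF-16 view has to be produced explicitly):

```python
s = u"\U0001D11E"  # one non-BMP character

code_points = len(s)  # 1: one character

# Number of UTF-16 code units a narrow (UCS-2) build would have stored:
code_units = len(s.encode("utf-16-be")) // 2  # 2: a surrogate pair

print(code_points, code_units)  # 1 2
```

A narrow build's len(), slicing, and indexing operated on those code units, so the string "claimed" two characters where there was one.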

James


From p.f.moore at gmail.com  Fri May  6 21:53:17 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 6 May 2005 20:53:17 +0100
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <d11dcfba050506111622ea81e3@mail.gmail.com>
References: <ca471dc2050506104520ed25bd@mail.gmail.com>
	<002001c55264$8065f560$11bd2c81@oemcomputer>
	<d11dcfba050506111622ea81e3@mail.gmail.com>
Message-ID: <79990c6b050506125337be4615@mail.gmail.com>

On 5/6/05, Steven Bethard <steven.bethard at gmail.com> wrote:
> Well, busy-work or not, I took the 20 minutes to split them up, so I
> figured I might as well make them available.  It was actually really
> easy to split them apart, and I think they both read better this way,
> but I'm not sure my opinion counts for much here anyway. ;-)  (The
> Enhanced Iterators PEP is first, the remainder of PEP 340 follows it.)

Thanks for doing this. I think you may well be right - the two pieces
feel more orthogonal like this (I haven't checked for dependencies,
I'm trusting your editing and Guido's original assertion that the
parts are independent).

> ----------------------------------------------------------------------
> PEP: XXX
> Title: Enhanced Iterators

Strawman question - as this is the "uncontroversial" bit, can this
part be accepted as it stands? :-)

Paul.

From mal at egenix.com  Fri May  6 22:02:03 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 06 May 2005 22:02:03 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>
	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>
Message-ID: <427BCD3B.1000201@egenix.com>

Nicholas Bastin wrote:
> On May 6, 2005, at 3:17 AM, M.-A. Lemburg wrote:
> 
> 
>>You've got that wrong: Python lets you choose UCS-4 -
>>UCS-2 is the default.
> 
> 
> No, that's not true.  Python lets you choose UCS-4 or UCS-2.  What the 
> default is depends on your platform.  If you run raw configure, some 
> systems will choose UCS-4, and some will choose UCS-2.  This is how the 
> conversation came about in the first place - running ./configure on 
> RHL9 gives you UCS-4.

Hmm, looking at the configure.in script, it seems you're right.
I wonder why this weird dependency on TCL was added. This was
certainly not intended (see the comment):

if test $enable_unicode = yes
then
  # Without any arguments, Py_UNICODE defaults to two-byte mode
  case "$have_ucs4_tcl" in
  yes) enable_unicode="ucs4"
       ;;
  *)   enable_unicode="ucs2"
       ;;
  esac
fi

The annotation suggests that Martin added this.

Martin, could you please explain why the whole *Python system*
should depend on what Unicode type some installed *TCL system*
is using? I fail to see the connection.

Thanks,
-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 06 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From steven.bethard at gmail.com  Fri May  6 22:04:21 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Fri, 6 May 2005 14:04:21 -0600
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <79990c6b050506125337be4615@mail.gmail.com>
References: <ca471dc2050506104520ed25bd@mail.gmail.com>
	<002001c55264$8065f560$11bd2c81@oemcomputer>
	<d11dcfba050506111622ea81e3@mail.gmail.com>
	<79990c6b050506125337be4615@mail.gmail.com>
Message-ID: <d11dcfba05050613047e89d14b@mail.gmail.com>

On 5/6/05, Paul Moore <p.f.moore at gmail.com> wrote:
> On 5/6/05, Steven Bethard <steven.bethard at gmail.com> wrote:
> > PEP: XXX
> > Title: Enhanced Iterators
> 
> Strawman question - as this is the "uncontroversial" bit, can this
> part be accepted as it stands? :-)

FWIW, I'm +1 on this.  Enhanced Iterators
 * updates the iterator protocol to use .__next__() instead of .next()
 * introduces a new builtin next()
 * allows continue-statements to pass values to iterators
 * allows generators to receive values with a yield-expression
The first two are, I believe, how the iterator protocol probably
should have been in the first place.  The second two provide a simple
way of passing values to generators, something I got the impression
that the co-routiney people would like a lot.
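
For the co-routiney use case, the last two bullets boil down to a generator whose yield is an expression. A minimal sketch, using the send() spelling that this mechanism eventually took (rather than the PEP's "continue EXPR"):

```python
def running_total():
    """Generator that yields its total and accepts increments from the caller."""
    total = 0
    while True:
        value = (yield total)  # receives whatever the caller passes in
        if value is not None:
            total += value

gen = running_total()
next(gen)           # prime the generator: the initial call passes no value
print(gen.send(3))  # 3
print(gen.send(4))  # 7
```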

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From eric.nieuwland at xs4all.nl  Fri May  6 22:13:10 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Fri, 6 May 2005 22:13:10 +0200
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc205050607513f9c1c9b@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<9768a4652f7ffa1778560b3548609827@xs4all.nl>
	<427A3971.8030400@gmail.com> <427ADEDA.1050205@canterbury.ac.nz>
	<ca471dc205050607513f9c1c9b@mail.gmail.com>
Message-ID: <3fc08ba39aa2628bdb245fe3032bf8cd@xs4all.nl>

Guido van Rossum wrote:
> try_stmt: 'try' ':' suite
>             (
>                 except_clause ':' suite)+
>                 ['else' ':' suite] ['finally' ':' suite]
>             |
>                 'finally' ':' suite
>             )
>
> There is no real complexity in this grammar, it's unambiguous, it's an
> easy enough job for the code generator, and it catches a certain class
> of mistakes (like mis-indenting some code).

Fair enough. Always nice to have some assistance from the system.
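
As a quick illustration, the combined statement that grammar permits (shown here with the modern "except ... as" spelling):

```python
def parse_int(text):
    """All four clauses in a single try statement."""
    result = None
    try:
        result = int(text)
    except ValueError as err:
        print("not an integer:", err)
    else:
        print("parsed ok")   # runs only when no exception was raised
    finally:
        print("cleanup")     # runs in every case
    return result
```

Mis-indenting the else or finally out of the statement is exactly the class of mistake the unambiguous grammar catches.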

--eric


From jimjjewett at gmail.com  Fri May  6 22:17:55 2005
From: jimjjewett at gmail.com (Jim Jewett)
Date: Fri, 6 May 2005 16:17:55 -0400
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
Message-ID: <fb6fbf5605050613175a5d47a6@mail.gmail.com>

Enhanced Iterators: 

...
> When the *initial* call to __next__() receives an argument 
> that is not None, TypeError is raised; this is likely caused
> by some logic error. 

This made sense when the (Block) Iterators were Resources,
and the first __next__() was just to trigger the setup.

It makes less sense for general iterators.

It is true that the first call in a generic for-loop couldn't 
pass a value (as it isn't continued), but I don't see anything
wrong with explicit calls to __next__.

Example:  An agent which responds to the environment;
the agent can execute multi-stage plans, or change its mind 
part way through.  

   action = scheduler.__next__(current_sensory_input)
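
Jim's agent can be sketched with a generator today, though via send() rather than an argument to __next__(), which is how value-passing eventually landed; the percepts and actions below are invented for illustration:

```python
def scheduler():
    """Agent policy: yields an action, receives the next percept."""
    action = "idle"
    while True:
        percept = (yield action)  # current_sensory_input arrives here
        if percept == "threat":
            action = "flee"       # abandon the current plan mid-way
        elif percept == "food":
            action = "eat"
        else:
            action = "explore"

agent = scheduler()
next(agent)                  # prime: the first call passes no value
print(agent.send("food"))    # eat
print(agent.send("threat"))  # flee
```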

-jJ

From gvanrossum at gmail.com  Fri May  6 22:18:24 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 13:18:24 -0700
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <d11dcfba05050613047e89d14b@mail.gmail.com>
References: <ca471dc2050506104520ed25bd@mail.gmail.com>
	<002001c55264$8065f560$11bd2c81@oemcomputer>
	<d11dcfba050506111622ea81e3@mail.gmail.com>
	<79990c6b050506125337be4615@mail.gmail.com>
	<d11dcfba05050613047e89d14b@mail.gmail.com>
Message-ID: <ca471dc205050613184108b3d0@mail.gmail.com>

On 5/6/05, Steven Bethard <steven.bethard at gmail.com> wrote:
> On 5/6/05, Paul Moore <p.f.moore at gmail.com> wrote:
> > On 5/6/05, Steven Bethard <steven.bethard at gmail.com> wrote:
> > > PEP: XXX
> > > Title: Enhanced Iterators
> >
> > Strawman question - as this is the "uncontroversial" bit, can this
> > part be accepted as it stands? :-)
> 
> FWIW, I'm +1 on this.  Enhanced Iterators
>  * updates the iterator protocol to use .__next__() instead of .next()
>  * introduces a new builtin next()
>  * allows continue-statements to pass values to iterators
>  * allows generators to receive values with a yield-expression
> The first two are, I believe, how the iterator protocol probably
> should have been in the first place.  The second two provide a simple
> way of passing values to generators, something I got the impression
> that the co-routiney people would like a lot.

At the same time it pretty much affects *only* the co-routiney people,
so there's no hurry. I'd be happy with PEP 340 without all this too. I
think one reason it ended up in that PEP is that an earlier version of
the PEP called __next__() with an exception argument instead of having
a separate __exit__() API.

There's one alternative possible (still orthogonal to PEP 340):
instead of __next__(), we could add an optional argument to the next()
method, and forget about the next() built-in. This is more compatible
(if less future-proof). Old iterators would raise an exception when
their next() is called with an argument, and this would be a
reasonable way to find out that you're using "continue EXPR" with an
iterator that doesn't support it. (The C level API would be a bit
hairier but it can all be done in a compatible way.)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From nbastin at opnet.com  Fri May  6 22:20:39 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Fri, 6 May 2005 16:20:39 -0400
Subject: [Python-Dev] New Py_UNICODE doc (Another Attempt)
In-Reply-To: <49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427B1A06.4010004@egenix.com>
	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>
	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
Message-ID: <80dee83acea5f90578c6e42e3c78bb03@opnet.com>

After reading through the code and the comments in this thread, I 
propose the following in the documentation as the definition of 
Py_UNICODE:

"This type represents the storage type which is used by Python 
internally as the basis for holding Unicode ordinals.  Extension module 
developers should make no assumptions about the size or native encoding 
of this type on any given platform."

The main point here is that extension developers cannot safely make 
assumptions about Py_UNICODE (which it appeared they could when the 
documentation stated that it was always 16-bits).

I don't propose that we put this information in the doc, but the 
possible internal representations are:

2-byte wchar_t or unsigned short encoded as UTF-16
4-byte wchar_t encoded as UTF-32 (UCS-4)

If you do not explicitly set the configure option, you cannot guarantee 
which you will get.  Python also does not normalize the byte order of 
unicode strings passed into it from C (via PyUnicode_EncodeUTF16, for 
example), so it is possible to have UTF-16LE and UTF-16BE strings in 
the system at the same time, which is a bit confusing.  This may or may 
not be worth a mention in the doc (or a patch).
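
The build variability described above was observable from Python itself via sys.maxunicode; a hedged sketch (on Python 3.3+ with flexible string storage the narrow/wide distinction is gone, so this always reports the wide result):

```python
import sys

if sys.maxunicode == 0xFFFF:
    # narrow build: 16-bit storage, non-BMP chars held as surrogate pairs
    build = "narrow (UCS-2/UTF-16 storage)"
else:
    # wide build: code points up to U+10FFFF stored directly
    build = "wide (UCS-4 storage)"

print(build)
```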

--
Nick


From nbastin at opnet.com  Fri May  6 22:21:53 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Fri, 6 May 2005 16:21:53 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427B1A06.4010004@egenix.com>
	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>
	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
Message-ID: <a39457493d9660bee8d1ace89067c990@opnet.com>


On May 6, 2005, at 3:42 PM, James Y Knight wrote:

> On May 6, 2005, at 2:49 PM, Nicholas Bastin wrote:
>> If this is the case, then we're clearly misleading users.  If the
>> configure script says UCS-2, then as a user I would assume that
>> surrogate pairs would *not* be encoded, because I chose UCS-2, and it
>> doesn't support that.  I would assume that any UTF-16 string I would
>> read would be transcoded into the internal type (UCS-2), and
>> information would be lost.  If this is not the case, then what does 
>> the
>> configure option mean?
>
> It means all the string operations treat strings as if they were 
> UCS-2, but that in actuality, they are UTF-16. Same as the case in the 
> windows APIs and Java. That is, all string operations are essentially 
> broken, because they're operating on encoded bytes, not characters, 
> but claim to be operating on characters.

Well, this is a completely separate issue/problem. The internal 
representation is UTF-16, and should be stated as such.  If the 
built-in methods actually don't work with surrogate pairs, then that 
should be fixed.

--
Nick


From gvanrossum at gmail.com  Fri May  6 22:31:58 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 6 May 2005 13:31:58 -0700
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <fb6fbf5605050613175a5d47a6@mail.gmail.com>
References: <fb6fbf5605050613175a5d47a6@mail.gmail.com>
Message-ID: <ca471dc205050613317db30a72@mail.gmail.com>

> Enhanced Iterators:
> 
> ...
> > When the *initial* call to __next__() receives an argument
> > that is not None, TypeError is raised; this is likely caused
> > by some logic error.

[Jim Jewett]
> This made sense when the (Block) Iterators were Resources,
> and the first __next__() was just to trigger the setup.
> 
> It makes less sense for general iterators.
> 
> It is true that the first call in a generic for-loop couldn't
> pass a value (as it isn't continued), but I don't see anything
> wrong with explicit calls to __next__.
> 
> Example:  An agent which responds to the environment;
> the agent can execute multi-stage plans, or change its mind
> part way through.
> 
>    action = scheduler.__next__(current_sensory_input)

Good point. I'd be happy if the requirement that the first __next__()
call doesn't have an argument (or that it's None) only applies to
generators, and not to iterators in general.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From shane at hathawaymix.org  Fri May  6 23:21:56 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Fri, 06 May 2005 15:21:56 -0600
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <a39457493d9660bee8d1ace89067c990@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
Message-ID: <427BDFF4.9030900@hathawaymix.org>

Nicholas Bastin wrote:
> On May 6, 2005, at 3:42 PM, James Y Knight wrote:
>>It means all the string operations treat strings as if they were 
>>UCS-2, but that in actuality, they are UTF-16. Same as the case in the 
>>windows APIs and Java. That is, all string operations are essentially 
>>broken, because they're operating on encoded bytes, not characters, 
>>but claim to be operating on characters.
> 
> 
> Well, this is a completely separate issue/problem. The internal 
> representation is UTF-16, and should be stated as such.  If the 
> built-in methods actually don't work with surrogate pairs, then that 
> should be fixed.

Wait... are you saying a Py_UNICODE array contains either UTF-16 or
UTF-32 characters, but never UCS-2?  That's a big surprise to me.  I may
need to change my PyXPCOM patch to fit this new understanding.  I tried
hard to not care how Python encodes unicode characters, but details like
this are important when combining two frameworks with different unicode
APIs.

Shane

From fredrik at pythonware.com  Sat May  7 00:00:17 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Sat, 7 May 2005 00:00:17 +0200
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
References: <d59vll$4qf$1@sea.gmane.org><9768a4652f7ffa1778560b3548609827@xs4all.nl><427A3971.8030400@gmail.com>
	<427ADEDA.1050205@canterbury.ac.nz><ca471dc205050607513f9c1c9b@mail.gmail.com><20050506153516.GA11224@phenix.progiciels-bpi.ca><d5g5q5$rd0$1@sea.gmane.org>
	<ca471dc205050610075b49a824@mail.gmail.com>
Message-ID: <d5gp0m$s7f$1@sea.gmane.org>

Guido van Rossum wrote:

> > (to save typing, you can use an empty string or even
> > put quotes around the exception name, but that may
> > make it harder to spot the change)
>
> Yeah, but that will stop working in Python 3.0.

well, I tend to remove my debugging hacks once I've fixed
the bug.  I definitely don't expect them to be compatible with
hypothetical future releases...

</F>




From pje at telecommunity.com  Sat May  7 00:24:13 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Fri, 06 May 2005 18:24:13 -0400
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP  340
In-Reply-To: <ca471dc205050613184108b3d0@mail.gmail.com>
References: <d11dcfba05050613047e89d14b@mail.gmail.com>
	<ca471dc2050506104520ed25bd@mail.gmail.com>
	<002001c55264$8065f560$11bd2c81@oemcomputer>
	<d11dcfba050506111622ea81e3@mail.gmail.com>
	<79990c6b050506125337be4615@mail.gmail.com>
	<d11dcfba05050613047e89d14b@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050506182302.02066c60@mail.telecommunity.com>

At 01:18 PM 5/6/2005 -0700, Guido van Rossum wrote:
>There's one alternative possible (still orthogonal to PEP 340):
>instead of __next__(), we could add an optional argument to the next()
>method, and forget about the next() built-in. This is more compatible
>(if less future-proof). Old iterators would raise an exception when
>their next() is called with an argument, and this would be a
>reasonable way to find out that you're using "continue EXPR" with an
>iterator that doesn't support it. (The C level API would be a bit
>hairier but it can all be done in a compatible way.)

+1.


From python-dev at zesty.ca  Sat May  7 00:30:22 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Fri, 6 May 2005 17:30:22 -0500 (CDT)
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <ca471dc205050613184108b3d0@mail.gmail.com>
References: <ca471dc2050506104520ed25bd@mail.gmail.com>
	<002001c55264$8065f560$11bd2c81@oemcomputer>
	<d11dcfba050506111622ea81e3@mail.gmail.com>
	<79990c6b050506125337be4615@mail.gmail.com>
	<d11dcfba05050613047e89d14b@mail.gmail.com>
	<ca471dc205050613184108b3d0@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505061729120.4786@server1.LFW.org>

On Fri, 6 May 2005, Guido van Rossum wrote:
> There's one alternative possible (still orthogonal to PEP 340):
> instead of __next__(), we could add an optional argument to the next()
> method, and forget about the next() built-in.

I prefer your original proposal.  I think this is a good time to switch
to next().  If we are going to change the protocol, let's do it right.


-- ?!ng

From nbastin at opnet.com  Sat May  7 00:53:24 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Fri, 6 May 2005 18:53:24 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427BDFF4.9030900@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427BDFF4.9030900@hathawaymix.org>
Message-ID: <851aac706e3951c4c7c5e6e5467eafff@opnet.com>


On May 6, 2005, at 5:21 PM, Shane Hathaway wrote:

> Nicholas Bastin wrote:
>> On May 6, 2005, at 3:42 PM, James Y Knight wrote:
>>> It means all the string operations treat strings as if they were
>>> UCS-2, but that in actuality, they are UTF-16. Same as the case in 
>>> the
>>> windows APIs and Java. That is, all string operations are essentially
>>> broken, because they're operating on encoded bytes, not characters,
>>> but claim to be operating on characters.
>>
>>
>> Well, this is a completely separate issue/problem. The internal
>> representation is UTF-16, and should be stated as such.  If the
>> built-in methods actually don't work with surrogate pairs, then that
>> should be fixed.
>
> Wait... are you saying a Py_UNICODE array contains either UTF-16 or
> UTF-32 characters, but never UCS-2?  That's a big surprise to me.  I 
> may
> need to change my PyXPCOM patch to fit this new understanding.  I tried
> hard to not care how Python encodes unicode characters, but details 
> like
> this are important when combining two frameworks with different unicode
> APIs.

Yes.  Well, inasmuch as a large part of UTF-16 directly overlaps 
UCS-2, unicode strings sometimes contain only UCS-2 characters.  
However, characters which would not be legal in UCS-2 are still encoded 
properly in python, in UTF-16.

And yes, I feel your pain, that's how I *got* into this position.  
Mapping from external unicode types is an important aspect of writing 
extension modules, and the documentation does not help people trying to 
do this.  The fact that python's internal encoding is variable is a 
huge problem in and of itself, even if that was documented properly.  
This is why tools like Xerces and ICU will be happy to give you 
whatever form of unicode strings you want, but internally they always 
use UTF-16 - to avoid having to write two internal implementations of 
the same functionality.  If you look up and down 
Objects/unicodeobject.c you'll see a fair amount of code written a 
couple of different ways (using #ifdef's) because of the variability in 
the internal representation.

--
Nick


From shane at hathawaymix.org  Sat May  7 01:05:38 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Fri, 06 May 2005 17:05:38 -0600
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <851aac706e3951c4c7c5e6e5467eafff@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427BDFF4.9030900@hathawaymix.org>
	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>
Message-ID: <427BF842.8060604@hathawaymix.org>

Nicholas Bastin wrote:
> 
> On May 6, 2005, at 5:21 PM, Shane Hathaway wrote:
>> Wait... are you saying a Py_UNICODE array contains either UTF-16 or
>> UTF-32 characters, but never UCS-2?  That's a big surprise to me.  I may
>> need to change my PyXPCOM patch to fit this new understanding.  I tried
>> hard to not care how Python encodes unicode characters, but details like
>> this are important when combining two frameworks with different unicode
>> APIs.
> 
> 
> Yes.  Well, in as much as a large part of UTF-16 directly overlaps
> UCS-2, then sometimes unicode strings contain UCS-2 characters. 
> However, characters which would not be legal in UCS-2 are still encoded
> properly in python, in UTF-16.
> 
> And yes, I feel your pain, that's how I *got* into this position. 
> Mapping from external unicode types is an important aspect of writing
> extension modules, and the documentation does not help people trying to
> do this.  The fact that python's internal encoding is variable is a huge
> problem in and of itself, even if that was documented properly.  This is
> why tools like Xerces and ICU will be happy to give you whatever form of
> unicode strings you want, but internally they always use UTF-16 - to
> avoid having to write two internal implementations of the same
> functionality.  If you look up and down Objects/unicodeobject.c you'll
> see a fair amount of code written a couple of different ways (using
> #ifdef's) because of the variability in the internal representation.

Ok.  Thanks for helping me understand where Python is WRT unicode.  I
can work around the issues (or maybe try to help solve them) now that I
know the current state of affairs.  If Python correctly handled UTF-16
strings internally, we wouldn't need the UCS-4 configuration switch,
would we?

Shane

From martin at v.loewis.de  Sat May  7 01:35:08 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 01:35:08 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <ed6dac3aa136985e60159713ff1d75ac@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>
	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>
	<427946B9.6070500@v.loewis.de>
	<ed6dac3aa136985e60159713ff1d75ac@opnet.com>
Message-ID: <427BFF2C.3020803@v.loewis.de>

Nicholas Bastin wrote:
> The important piece of information is that it is not guaranteed to be a
> particular one of those sizes.  Once you can't guarantee the size, no
> one really cares what size it is.

Please trust many years of experience: This is just not true. People
do care, and they want to know. If we tell them "it depends", they
ask "how can I find out".

> The documentation should discourage
> developers from attempting to manipulate Py_UNICODE directly, which,
> other than trivia, is the only reason why someone would care what size
> the internal representation is.

Why is that? Of *course* people will have to manipulate Py_UNICODE*
buffers directly. What else can they use?

Regards,
Martin

From martin at v.loewis.de  Sat May  7 01:37:37 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 01:37:37 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>
	<427946B9.6070500@v.loewis.de> <42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
Message-ID: <427BFFC1.6040202@v.loewis.de>

Nicholas Bastin wrote:
> I'm not sure the Python documentation is the place to teach someone
> about unicode.  The ISO 10646 pretty clearly defines UCS-2 as only
> containing characters in the BMP (plane zero).  On the other hand, I
> don't know why python lets you choose UCS-2 anyhow, since it's almost
> always not what you want.

It certainly is, in most cases. On Windows, it is the only way to
get reasonable interoperability with the platform's WCHAR (i.e.
just cast a Py_UNICODE* into a WCHAR*).

To a limited degree, in UCS-2 mode, Python has support for surrogate
characters (e.g. in UTF-8 codec), so it is not "pure" UCS-2, but
this is a minor issue.
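
A quick sketch of the surrogate machinery in question (assuming a modern
Python 3, where str is always full Unicode but the UTF-16 codec still
shows the pairing):

```python
# Sketch: a character outside the BMP is carried as a surrogate pair
# in UTF-16. UCS-2 proper cannot express this character at all.
ch = "\U00010000"                       # first code point past the BMP
data = ch.encode("utf-16-be")
units = [int.from_bytes(data[i:i + 2], "big") for i in (0, 2)]
# High surrogate followed by low surrogate:
assert units == [0xD800, 0xDC00]
```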

Regards,
Martin

From bob at redivi.com  Sat May  7 01:40:01 2005
From: bob at redivi.com (Bob Ippolito)
Date: Fri, 6 May 2005 19:40:01 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427BF842.8060604@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427BDFF4.9030900@hathawaymix.org>
	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>
	<427BF842.8060604@hathawaymix.org>
Message-ID: <FD81A8FF-E572-4D1F-9873-DE3576E7AE89@redivi.com>

On May 6, 2005, at 7:05 PM, Shane Hathaway wrote:

> Nicholas Bastin wrote:
>
>> On May 6, 2005, at 5:21 PM, Shane Hathaway wrote:
>>
>>> Wait... are you saying a Py_UNICODE array contains either UTF-16 or
>>> UTF-32 characters, but never UCS-2?  That's a big surprise to  
>>> me.  I may
>>> need to change my PyXPCOM patch to fit this new understanding.  I  
>>> tried
>>> hard to not care how Python encodes unicode characters, but  
>>> details like
>>> this are important when combining two frameworks with different  
>>> unicode
>>> APIs.
>>
>> Yes.  Well, in as much as a large part of UTF-16 directly overlaps
>> UCS-2, then sometimes unicode strings contain UCS-2 characters.
>> However, characters which would not be legal in UCS-2 are still  
>> encoded
>> properly in python, in UTF-16.
>>
>> And yes, I feel your pain, that's how I *got* into this position.
>> Mapping from external unicode types is an important aspect of writing
>> extension modules, and the documentation does not help people  
>> trying to
>> do this.  The fact that python's internal encoding is variable is  
>> a huge
>> problem in and of itself, even if that was documented properly.   
>> This is
>> why tools like Xerces and ICU will be happy to give you whatever  
>> form of
>> unicode strings you want, but internally they always use UTF-16 - to
>> avoid having to write two internal implementations of the same
>> functionality.  If you look up and down Objects/unicodeobject.c  
>> you'll
>> see a fair amount of code written a couple of different ways (using
>> #ifdef's) because of the variability in the internal representation.
>>
>
> Ok.  Thanks for helping me understand where Python is WRT unicode.  I
> can work around the issues (or maybe try to help solve them) now  
> that I
> know the current state of affairs.  If Python correctly handled UTF-16
> strings internally, we wouldn't need the UCS-4 configuration switch,
> would we?

Personally I would rather see Python (3000) grow a new way to  
represent strings, more along the lines of the way it's typically  
done in Objective-C.  I wrote a little bit about how that works here:

http://bob.pythonmac.org/archives/2005/04/04/pyobjc-and-unicode/

Effectively, instead of having One And Only One Way To Store Text,  
you would have one and only one base class (say basestring) that has  
some "virtual" methods that know how to deal with text.  Then, you  
have several concrete implementations that implement those functions  
for their particular backing store (and possibly encoding, but that  
might be implicit in the backing store, i.e. whether it's an ASCII,  
UCS-2 or UCS-4 backing store).  Currently we more or less have this  
at the Python level, between str and unicode, but certainly not at  
the C API.
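
A toy sketch of that shape (my own illustration with made-up class
names, not an actual proposal): one abstract base class plus concrete
implementations per backing store.

```python
from abc import ABC, abstractmethod

class BaseText(ABC):
    """Abstract text interface; subclasses choose the backing store."""
    @abstractmethod
    def code_point_at(self, i): ...
    @abstractmethod
    def __len__(self): ...

class AsciiText(BaseText):
    def __init__(self, data: bytes):
        self._data = data             # one byte per character
    def code_point_at(self, i):
        return self._data[i]
    def __len__(self):
        return len(self._data)

class Ucs4Text(BaseText):
    def __init__(self, points):
        self._points = list(points)   # one int per code point
    def code_point_at(self, i):
        return self._points[i]
    def __len__(self):
        return len(self._points)

assert AsciiText(b"abc").code_point_at(0) == 0x61
assert Ucs4Text([0x12345]).code_point_at(0) == 0x12345
```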

-bob


From martin at v.loewis.de  Sat May  7 01:40:40 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 01:40:40 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427AB26B.2040004@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>
	<427946B9.6070500@v.loewis.de> <42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427AB26B.2040004@hathawaymix.org>
Message-ID: <427C0078.3090806@v.loewis.de>

Shane Hathaway wrote:
> Then something in the Python docs ought to say why UCS-2 is not what you
> want.  I still don't know; I've heard differing opinions on the subject.
>  Some say you'll never need more than what UCS-2 provides.  Is that
> incorrect?

That clearly depends on who "you" is.

> More generally, how should a non-unicode-expert writing Python extension
> code find out the minimum they need to know about unicode to use the
> Python unicode API?  The API reference [1] ought to at least have a list
> of background links.  I had to hunt everywhere.

That, of course, depends on what your background is. Did you know what
Latin-1 is, when you started? How it relates to code page 1252? What
UTF-8 is? What an abstract character is, as opposed to a byte sequence
on the one hand, and to a glyph on the other hand?

Different people need different background, especially if they are
writing different applications.

Regards,
Martin

From martin at v.loewis.de  Sat May  7 01:43:15 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 01:43:15 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <9772ff3ac8afbd8d4451968a065e281b@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427B1A06.4010004@egenix.com>
	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>
Message-ID: <427C0113.7040203@v.loewis.de>

Nicholas Bastin wrote:
> If this is the case, then we're clearly misleading users.  If the
> configure script says UCS-2, then as a user I would assume that
> surrogate pairs would *not* be encoded, because I chose UCS-2, and it
> doesn't support that.

What do you mean by that? That the interpreter crashes if you try
to store a low surrogate into a Py_UNICODE?

> I would assume that any UTF-16 string I would
> read would be transcoded into the internal type (UCS-2), and information
> would be lost.  If this is not the case, then what does the configure
> option mean?

It tells you whether you have the two-octet form of the Universal
Character Set, or the four-octet form.
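
The distinction was visible from Python itself via sys.maxunicode (a
sketch; since Python 3.3 the build switch is gone and the value is
always 0x10FFFF):

```python
import sys

# On the old two-octet (narrow) builds sys.maxunicode was 0xFFFF; on
# four-octet (wide) builds, and on every Python since 3.3, it is 0x10FFFF.
wide = sys.maxunicode == 0x10FFFF
assert wide
```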

Regards,
Martin

From martin at v.loewis.de  Sat May  7 01:45:24 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 01:45:24 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <d6016a11df6560db06fc5184e6a873bc@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<ed6dac3aa136985e60159713ff1d75ac@opnet.com>
	<427B1BD6.1060206@egenix.com>
	<d6016a11df6560db06fc5184e6a873bc@opnet.com>
Message-ID: <427C0194.3000008@v.loewis.de>

Nicholas Bastin wrote:
> Because the encoding of that buffer appears to be different depending on
> the configure options.

What makes it appear so? sizeof(Py_UNICODE) changes when you change
the option - does that, in your mind, mean that the encoding changes?

> If that isn't true, then someone needs to change
> the doc, and the configure options.  Right now, it seems *very* clear
> that Py_UNICODE may either be UCS-2 or UCS-4 encoded if you read the
> configure help, and you can't use the buffer directly if the encoding is
> variable.  However, you seem to be saying that this isn't true.

It's a compile-time option (as all configure options). So at run-time,
it isn't variable.

Regards,
Martin

From martin at v.loewis.de  Sat May  7 01:48:04 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 01:48:04 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427B1A06.4010004@egenix.com>
	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>
Message-ID: <427C0234.5000408@v.loewis.de>

Nicholas Bastin wrote:
> No, that's not true.  Python lets you choose UCS-4 or UCS-2.  What the
> default is depends on your platform.

The truth is more complicated. If your Tcl is built for UCS-4, then
Python will also be built for UCS-4 (unless overridden by command line).
Otherwise, Python will default to UCS-2.

Regards,
Martin

From martin at v.loewis.de  Sat May  7 01:49:49 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 01:49:49 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427BCD3B.1000201@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>
	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>
	<427BCD3B.1000201@egenix.com>
Message-ID: <427C029D.3090907@v.loewis.de>

M.-A. Lemburg wrote:
> Hmm, looking at the configure.in script, it seems you're right.
> I wonder why this weird dependency on TCL was added.

If Python is configured for UCS-2, and Tcl for UCS-4, then
Tkinter would not work out of the box. Hence the weird dependency.

Regards,
Martin

From nbastin at opnet.com  Sat May  7 02:01:50 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Fri, 6 May 2005 20:01:50 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427C0113.7040203@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427B1A06.4010004@egenix.com>
	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>
	<427C0113.7040203@v.loewis.de>
Message-ID: <76f88e8be4003929d4944c357b041b96@opnet.com>


On May 6, 2005, at 7:43 PM, Martin v. Löwis wrote:

> Nicholas Bastin wrote:
>> If this is the case, then we're clearly misleading users.  If the
>> configure script says UCS-2, then as a user I would assume that
>> surrogate pairs would *not* be encoded, because I chose UCS-2, and it
>> doesn't support that.
>
> What do you mean by that? That the interpreter crashes if you try
> to store a low surrogate into a Py_UNICODE?

What I mean is pretty clear.  UCS-2 does *NOT* support surrogate pairs. 
  If it did, it would be called UTF-16.  If Python really supported 
UCS-2, then surrogate pairs from UTF-16 inputs would either get turned 
into two garbage characters, or the "I couldn't transcode this" UCS-2 
code point (I don't remember which one it is off the top of my head).

>> I would assume that any UTF-16 string I would
>> read would be transcoded into the internal type (UCS-2), and 
>> information
>> would be lost.  If this is not the case, then what does the configure
>> option mean?
>
> It tells you whether you have the two-octet form of the Universal
> Character Set, or the four-octet form.

It would, if that were the case, but it's not.  Setting UCS-2 in the 
configure script really means UTF-16, and as such, the documentation 
should reflect that.

--
Nick


From nbastin at opnet.com  Sat May  7 02:06:55 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Fri, 6 May 2005 20:06:55 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427C0194.3000008@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>
	<ed6dac3aa136985e60159713ff1d75ac@opnet.com>
	<427B1BD6.1060206@egenix.com>
	<d6016a11df6560db06fc5184e6a873bc@opnet.com>
	<427C0194.3000008@v.loewis.de>
Message-ID: <2ee14995c7e9d4fb650fbe3844d35dcb@opnet.com>


On May 6, 2005, at 7:45 PM, Martin v. Löwis wrote:

> Nicholas Bastin wrote:
>> Because the encoding of that buffer appears to be different depending 
>> on
>> the configure options.
>
> What makes it appear so? sizeof(Py_UNICODE) changes when you change
> the option - does that, in your mind, mean that the encoding changes?

Yes.  Not only in my mind, but in the Python source code.  If 
Py_UNICODE is 4 bytes wide, then the encoding is UTF-32 (UCS-4), 
otherwise the encoding is UTF-16 (*not* UCS-2).

>> If that isn't true, then someone needs to change
>> the doc, and the configure options.  Right now, it seems *very* clear
>> that Py_UNICODE may either be UCS-2 or UCS-4 encoded if you read the
>> configure help, and you can't use the buffer directly if the encoding 
>> is
>> variable.  However, you seem to be saying that this isn't true.
>
> It's a compile-time option (as all configure options). So at run-time,
> it isn't variable.

What I mean by 'variable' is that you can't make any assumption as to 
what the size will be in any given python when you're writing (and 
building) an extension module.  This breaks binary compatibility of 
extensions modules on the same platform and same version of python 
across interpreters which may have been built with different configure 
options.

--
Nick


From martin at v.loewis.de  Sat May  7 02:11:49 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 02:11:49 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <a39457493d9660bee8d1ace89067c990@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
Message-ID: <427C07C5.7060106@v.loewis.de>

Nicholas Bastin wrote:
> Well, this is a completely separate issue/problem. The internal 
> representation is UTF-16, and should be stated as such.  If the 
> built-in methods actually don't work with surrogate pairs, then that 
> should be fixed.

Yes to the former, no to the latter. PEP 261 specifies what should
and shouldn't work.

Regards,
Martin

From martin at v.loewis.de  Sat May  7 02:15:27 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 02:15:27 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427BF842.8060604@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>
	<427BF842.8060604@hathawaymix.org>
Message-ID: <427C089F.2010808@v.loewis.de>

Shane Hathaway wrote:
> Ok.  Thanks for helping me understand where Python is WRT unicode.  I
> can work around the issues (or maybe try to help solve them) now that I
> know the current state of affairs.  If Python correctly handled UTF-16
> strings internally, we wouldn't need the UCS-4 configuration switch,
> would we?

Define correctly. Python, in ucs2 mode, will let you address individual
surrogate codes, e.g. in indexing. So you get

>>> u"\U00012345"[0]
u'\ud808'

This will never work "correctly", and never should, because an efficient
implementation isn't possible. If you want "safe" indexing and slicing,
you need ucs4.
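
On a ucs4 (or any modern) build the same expression indexes whole code
points; the narrow-build result above is exactly the high surrogate of
the UTF-16 form (a sketch on Python 3):

```python
s = "\U00012345"
# Wide/modern behaviour: one character, indexable as a whole.
assert len(s) == 1 and s[0] == "\U00012345"
# The narrow-build u'\ud808' corresponds to the first UTF-16 code unit:
high = int.from_bytes(s.encode("utf-16-be")[:2], "big")
assert high == 0xD808
```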

Regards,
Martin

From martin at v.loewis.de  Sat May  7 02:18:47 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 02:18:47 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <76f88e8be4003929d4944c357b041b96@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<427C0113.7040203@v.loewis.de>
	<76f88e8be4003929d4944c357b041b96@opnet.com>
Message-ID: <427C0967.3030201@v.loewis.de>

Nicholas Bastin wrote:
> What I mean is pretty clear.  UCS-2 does *NOT* support surrogate pairs. 
>   If it did, it would be called UTF-16.  If Python really supported 
> UCS-2, then surrogate pairs from UTF-16 inputs would either get turned 
> into two garbage characters, or the "I couldn't transcode this" UCS-2 
> code point (I don't remember which one it is off the top of my head).

OTOH, if Python really supported UTF-16, then unichr(0x10000) would
work, and len(u"\U00010000") would be 1.

It is primarily just the UTF-8 codec which supports UTF-16.
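
Both criteria can be checked directly (a sketch; on Python 3 they now
hold, and the UTF-8 codec round-trips the character as noted):

```python
# Martin's two tests for "really supports UTF-16":
c = chr(0x10000)                   # unichr(0x10000) in the 2.x spelling
assert len("\U00010000") == 1      # one character, not two code units
# The UTF-8 codec handles the character in either case:
assert c.encode("utf-8").decode("utf-8") == c
```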

Regards,
Martin

From martin at v.loewis.de  Sat May  7 02:25:55 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 02:25:55 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <2ee14995c7e9d4fb650fbe3844d35dcb@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<ed6dac3aa136985e60159713ff1d75ac@opnet.com>	<427B1BD6.1060206@egenix.com>	<d6016a11df6560db06fc5184e6a873bc@opnet.com>	<427C0194.3000008@v.loewis.de>
	<2ee14995c7e9d4fb650fbe3844d35dcb@opnet.com>
Message-ID: <427C0B13.1090502@v.loewis.de>

Nicholas Bastin wrote:
> Yes.  Not only in my mind, but in the Python source code.  If 
> Py_UNICODE is 4 bytes wide, then the encoding is UTF-32 (UCS-4), 
> otherwise the encoding is UTF-16 (*not* UCS-2).

I see. Some people equate "encoding" with "encoding scheme";
neither UTF-32 nor UTF-16 is an encoding scheme. You were
apparently talking about encoding forms.

> What I mean by 'variable' is that you can't make any assumption as to 
> what the size will be in any given python when you're writing (and 
> building) an extension module.  This breaks binary compatibility of 
> extensions modules on the same platform and same version of python 
> across interpreters which may have been built with different configure 
> options.

True. The breakage will be quite obvious, in most cases: the module
fails to load because not only sizeof(Py_UNICODE) changes, but also
the names of all symbols change.

Regards,
Martin

From ncoghlan at iinet.net.au  Sat May  7 04:17:44 2005
From: ncoghlan at iinet.net.au (Nick Coghlan)
Date: Sat, 07 May 2005 12:17:44 +1000
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
Message-ID: <427C2548.8010907@iinet.net.au>

PEP 340 contains several different ideas. This rewrite separates them into five 
major areas:
  - passing data into an iterator
  - finalising iterators
  - integrating finalisation into for loops
  - the new non-looping finalising statement
  - integrating all of these with generators.

The first area has nothing to do with finalisation, so it is not included in 
this rewrite (Steven Bethard wrote an Enhanced Iterators pre-PEP which covers 
only that area, though).

The whole PEP draft can be found here:
http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

But I've inlined some examples that differ from or aren't in PEP 340 for those 
that don't have time to read the whole thing (example numbers are from the PEP):

4. A template that tries something up to n times::

         def auto_retry(n=3, exc=Exception):
             for i in range(n):
                 try:
                     yield
                 except exc, err:
                     # perhaps log exception here
                     yield
             raise # re-raise the exception we caught earlier

Used as follows::

         for del auto_retry(3, IOError):
             f = urllib.urlopen("http://python.org/")
             print f.read()
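
The same behaviour is also expressible today without any new syntax, as
a plain helper (my sketch, not part of the PEP; assumes n >= 1):

```python
def retry(n, exc, func):
    # Call func() up to n times; re-raise the last exception if every
    # attempt fails (assumes n >= 1).
    for _ in range(n):
        try:
            return func()
        except exc as err:
            last = err
    raise last

# e.g. retry(3, IOError, lambda: urllib.request.urlopen(url).read())
```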

6. It is easy to write a regular class with the semantics of example 1::

         class locking:
            def __init__(self, lock):
                self.lock = lock
            def __enter__(self):
                self.lock.acquire()
            def __exit__(self, type, value=None, traceback=None):
                self.lock.release()
                if type is not None:
                    raise type, value, traceback

(This example is easily modified to implement the other examples; it shows that 
generators are not always the simplest way to do things.)

8. Find the first file with a specific header::

         for name in filenames:
             stmt opening(name) as f:
                 if f.read(2) == 0xFEB0: break

9. Find the first item you can handle, holding a lock for the entire loop, or 
just for each iteration::

         stmt locking(lock):
             for item in items:
                 if handle(item): break

         for item in items:
             stmt locking(lock):
                 if handle(item): break

10. Hold a lock while inside a generator, but release it when returning control 
to the outer scope::

         stmt locking(lock):
             for item in items:
                 stmt unlocking(lock):
                     yield item
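
For comparison, a sketch of example 9 in terms of an ordinary context
manager protocol (threading.Lock already supplies __enter__/__exit__,
so the "hold the lock for the entire loop" case needs no new machinery;
handle() here is a stand-in):

```python
import threading

lock = threading.Lock()
items = [1, 2, 3]
handled = []

def handle(item):
    # Stand-in for the PEP's handle(): succeeds on item 2.
    handled.append(item)
    return item == 2

with lock:                 # hold the lock for the entire loop
    for item in items:
        if handle(item):
            break

assert handled == [1, 2] and not lock.locked()
```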


Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From bac at OCF.Berkeley.EDU  Sat May  7 05:23:26 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Fri, 06 May 2005 20:23:26 -0700
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <ca471dc205050613184108b3d0@mail.gmail.com>
References: <ca471dc2050506104520ed25bd@mail.gmail.com>	<002001c55264$8065f560$11bd2c81@oemcomputer>	<d11dcfba050506111622ea81e3@mail.gmail.com>	<79990c6b050506125337be4615@mail.gmail.com>	<d11dcfba05050613047e89d14b@mail.gmail.com>
	<ca471dc205050613184108b3d0@mail.gmail.com>
Message-ID: <427C34AE.3050205@ocf.berkeley.edu>

Guido van Rossum wrote:
[SNIP]
> There's one alternative possible (still orthogonal to PEP 340):
> instead of __next__(), we could add an optional argument to the next()
> method, and forget about the next() built-in. This is more compatible
> (if less future-proof). Old iterators would raise an exception when
> their next() is called with an argument, and this would be a
> reasonable way to find out that you're using "continue EXPR" with an
> iterator that doesn't support it. (The C level API would be a bit
> hairier but it can all be done in a compatible way.)
> 

I prefer the original proposal.

-Brett

From nbastin at opnet.com  Sat May  7 06:04:47 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Sat, 7 May 2005 00:04:47 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427C0B13.1090502@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<ed6dac3aa136985e60159713ff1d75ac@opnet.com>	<427B1BD6.1060206@egenix.com>	<d6016a11df6560db06fc5184e6a873bc@opnet.com>	<427C0194.3000008@v.loewis.de>
	<2ee14995c7e9d4fb650fbe3844d35dcb@opnet.com>
	<427C0B13.1090502@v.loewis.de>
Message-ID: <73d0c1d776eec4a5dee10b0e09990184@opnet.com>


On May 6, 2005, at 8:25 PM, Martin v. Löwis wrote:

> Nicholas Bastin wrote:
>> Yes.  Not only in my mind, but in the Python source code.  If
>> Py_UNICODE is 4 bytes wide, then the encoding is UTF-32 (UCS-4),
>> otherwise the encoding is UTF-16 (*not* UCS-2).
>
> I see. Some people equate "encoding" with "encoding scheme";
> neither UTF-32 nor UTF-16 is an encoding scheme. You were

That's not true.  UTF-16 and UTF-32 are both CES and CEF (although this 
is not true of UTF-16LE and BE).  UTF-32 is a fixed-width encoding form 
within a code space of (0..10FFFF) and UTF-16 is a variable-width 
encoding form which uses one or two 16-bit code units, each in 
the range (0..FFFF).  However, you are perhaps right to point 
out that people should be more explicit as to which they are referring 
to.  UCS-2, however, is only a CEF, and thus I thought it was obvious 
that I was referring to UTF-16 as a CEF.  I would point anyone who is 
confused at this point to Unicode Technical Report #17 on the Character 
Encoding Model, which is much more clear than trying to piece together 
the relevant parts out of the entire standard.

In any event, Python's use of the term UCS-2 is incorrect.  I quote 
from the TR:

"The UCS-2 encoding form, which is associated with ISO/IEC 10646 and 
can only express characters in the  BMP, is a fixed-width encoding 
form."

immediately followed by:

"In contrast, UTF-16 uses either one or two code  units and is able to 
cover the entire code space of Unicode."

If Python is capable of representing the entire code space of Unicode 
when you choose --unicode=ucs2, then that is a bug.  It either should 
not be called UCS-2, or the interpreter should be bound by the 
limitations of the UCS-2 CEF.


>> What I mean by 'variable' is that you can't make any assumption as to
>> what the size will be in any given python when you're writing (and
>> building) an extension module.  This breaks binary compatibility of
>> extension modules on the same platform and same version of python
>> across interpreters which may have been built with different configure
>> options.
>
> True. The breakage will be quite obvious, in most cases: the module
> fails to load because not only sizeof(Py_UNICODE) changes, but also
> the names of all symbols change.

Yes, but the important question here is why would we want that?  Why 
doesn't Python just have *one* internal representation of a Unicode 
character?  Having more than one possible definition just creates 
problems, and provides no value.

--
Nick


From nbastin at opnet.com  Sat May  7 06:11:33 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Sat, 7 May 2005 00:11:33 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427C07C5.7060106@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427C07C5.7060106@v.loewis.de>
Message-ID: <dcb880b2b0bee21478dcfebe3070302e@opnet.com>


On May 6, 2005, at 8:11 PM, Martin v. Löwis wrote:

> Nicholas Bastin wrote:
>> Well, this is a completely separate issue/problem. The internal
>> representation is UTF-16, and should be stated as such.  If the
>> built-in methods actually don't work with surrogate pairs, then that
>> should be fixed.
>
> Yes to the former, no to the latter. PEP 261 specifies what should
> and shouldn't work.

This PEP has several textual errors and ambiguities (which, admittedly, 
may have been a necessary state given the unicode standard in 2001).  
However, putting that aside, I would recommend that:

--enable-unicode=ucs2

be replaced with:

--enable-unicode=utf16

and the docs be updated to reflect more accurately the variance of the 
internal storage type.

I would also like the community to strongly consider standardizing on a 
single internal representation, but I will leave that fight for another 
day.

--
Nick


From michele.simionato at gmail.com  Sat May  7 07:45:30 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Sat, 7 May 2005 01:45:30 -0400
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <d11dcfba05050613047e89d14b@mail.gmail.com>
References: <ca471dc2050506104520ed25bd@mail.gmail.com>
	<002001c55264$8065f560$11bd2c81@oemcomputer>
	<d11dcfba050506111622ea81e3@mail.gmail.com>
	<79990c6b050506125337be4615@mail.gmail.com>
	<d11dcfba05050613047e89d14b@mail.gmail.com>
Message-ID: <4edc17eb05050622452c13d600@mail.gmail.com>

On 5/6/05, Steven Bethard <steven.bethard at gmail.com> wrote:
> FWIW, I'm +1 on this.  Enhanced Iterators
>  * updates the iterator protocol to use .__next__() instead of .next()
>  * introduces a new builtin next()
>  * allows continue-statements to pass values to iterators
>  * allows generators to receive values with a yield-expression
> The first two are, I believe, how the iterator protocol probably
> should have been in the first place.  The second two provide a simple
> way of passing values to generators, something I got the impression
> that the co-routiney people would like a lot.

Thank you for splitting the PEP. Conceptually, the "coroutine" part  
has nothing to do with blocks and stands on its own, so it is right
to discuss it separately from the block syntax.

Personally, I do not see an urgent need for the block syntax (most of
the use cases can be managed with decorators) nor for the "coroutine"
syntax (you can already use Armin Rigo's greenlets for that).

Anyway, the idea of passing arguments to generators is pretty cool,
here is some code I have, adapted from Armin's presentation at the
ACCU conference:

from py.magic import greenlet

def yield_(*args):
    return greenlet.getcurrent().parent.switch(*args)

def send(key):
    return process_commands.switch(key)

@greenlet
def process_commands():
    while True:
        line = ''
        while not line.endswith('\n'):
            line += yield_()
        print line,
        if line == 'quit\n':
            print "are you sure?"
            if yield_() == 'y':
                break
            
process_commands.switch() # start the greenlet

send("h")
send("e")
send("l")
send("l")
send("o")
send("\n")

send("q")
send("u")
send("i")
send("t")
send("\n")
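
The same character pump can also be written with plain generators and
the yield-expression/.send() protocol under discussion (my sketch,
collecting lines into a list rather than printing):

```python
def process_commands(output):
    # Receive one character per send(); accumulate until newline.
    line = ''
    while True:
        line += yield
        if line.endswith('\n'):
            output.append(line)
            if line == 'quit\n':
                break
            line = ''

out = []
g = process_commands(out)
next(g)                      # prime the generator (advance to first yield)
for ch in "hello\n" + "quit":
    g.send(ch)
try:
    g.send("\n")             # 'quit\n' completes and the generator returns
except StopIteration:
    pass

assert out == ['hello\n', 'quit\n']
```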
  

Michele Simionato

From shane at hathawaymix.org  Sat May  7 10:05:42 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Sat, 07 May 2005 02:05:42 -0600
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427C089F.2010808@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>
	<427BF842.8060604@hathawaymix.org> <427C089F.2010808@v.loewis.de>
Message-ID: <427C76D6.4050409@hathawaymix.org>

Martin v. Löwis wrote:
> Define correctly. Python, in ucs2 mode, will allow to address individual
> surrogate codes, e.g. in indexing. So you get
> 
> 
>>>>u"\U00012345"[0]

When Python encodes characters internally in UCS-2, I would expect
u"\U00012345" to produce a UnicodeError("character can not be encoded in
UCS-2").

> u'\ud808'
> 
> This will never work "correctly", and never should, because an efficient
> implementation isn't possible. If you want "safe" indexing and slicing,
> you need ucs4.
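For concreteness, the surrogate value quoted above (u'\ud808') follows mechanically from the standard UTF-16 split of a supplementary code point; a small checkable sketch:

```python
# U+12345 stored on a narrow build becomes this surrogate pair.
cp = 0x12345
offset = cp - 0x10000           # 20-bit value spread over two code units
hi = 0xD800 + (offset >> 10)    # high (lead) surrogate
lo = 0xDC00 + (offset & 0x3FF)  # low (trail) surrogate
print(hex(hi), hex(lo))         # 0xd808 0xdf45
```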

I agree that UCS4 is needed.  There is a balancing act here; UTF-16 is
widely used and takes less space, while UCS4 is easier to treat as an
array of characters.  Maybe we can have both: unicode objects start with
an internal representation in UTF-16, but get promoted automatically to
UCS4 when you index or slice them.  The difference will not be visible
to Python code.  A compile-time switch will not be necessary.  What do
you think?

Shane

From shane at hathawaymix.org  Sat May  7 11:14:21 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Sat, 07 May 2005 03:14:21 -0600
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427C0078.3090806@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>
	<427946B9.6070500@v.loewis.de> <42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427AB26B.2040004@hathawaymix.org> <427C0078.3090806@v.loewis.de>
Message-ID: <427C86ED.7000805@hathawaymix.org>

Martin v. Löwis wrote:
> Shane Hathaway wrote:
>>More generally, how should a non-unicode-expert writing Python extension
>>code find out the minimum they need to know about unicode to use the
>>Python unicode API?  The API reference [1] ought to at least have a list
>>of background links.  I had to hunt everywhere.
> 
> That, of course, depends on what your background is. Did you know what
> Latin-1 is, when you started? How it relates to code page 1252? What
> UTF-8 is? What an abstract character is, as opposed to a byte sequence
> on the one hand, and to a glyph on the other hand?
>
> Different people need different background, especially if they are
> writing different applications.

Yes, but the first few steps are the same for nearly everyone, and
people need more help taking the first few steps.  In particular:

- The Python docs link to unicode.org, but unicode.org is complicated,
long-winded, and leaves many questions unanswered.  The Wikipedia
article is far better.  I wish I had thought to look there instead.

  http://en.wikipedia.org/wiki/Unicode

- The docs should say what to expect to happen when a large unicode
character winds up in a Py_UNICODE array.  For instance, what is
len(u'\U00012345')?  1 or 2?  Does the answer depend on the UCS4
compile-time switch?

- The docs should help developers evaluate whether they need the UCS4
compile-time switch.  Is UCS2 good enough for Asia?  For math?  For
hieroglyphics? <wink>
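The first question above can in fact be answered mechanically; a hedged sketch, using sys.maxunicode to detect the build configuration (on a narrow build the character occupies a surrogate pair):

```python
import sys

s = u'\U00012345'
if sys.maxunicode == 0xFFFF:
    # narrow (UCS-2/UTF-16) build: stored as a surrogate pair
    assert len(s) == 2
else:
    # wide (UCS-4) build -- also any Python 3.3+ with flexible storage
    assert len(s) == 1
```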

Shane

From martin at v.loewis.de  Sat May  7 15:24:49 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 15:24:49 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <73d0c1d776eec4a5dee10b0e09990184@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<ed6dac3aa136985e60159713ff1d75ac@opnet.com>	<427B1BD6.1060206@egenix.com>	<d6016a11df6560db06fc5184e6a873bc@opnet.com>	<427C0194.3000008@v.loewis.de>
	<2ee14995c7e9d4fb650fbe3844d35dcb@opnet.com>
	<427C0B13.1090502@v.loewis.de>
	<73d0c1d776eec4a5dee10b0e09990184@opnet.com>
Message-ID: <427CC1A1.4080206@v.loewis.de>

Nicholas Bastin wrote:
> Yes, but the important question here is why would we want that?  Why
> doesn't Python just have *one* internal representation of a Unicode
> character?  Having more than one possible definition just creates
> problems, and provides no value.

It does provide value; there are good reasons for each setting. Which
of the two alternatives do you consider useless?

Regards,
Martin

From martin at v.loewis.de  Sat May  7 15:29:38 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 15:29:38 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <dcb880b2b0bee21478dcfebe3070302e@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427C07C5.7060106@v.loewis.de>
	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>
Message-ID: <427CC2C2.60000@v.loewis.de>

Nicholas Bastin wrote:
> --enable-unicode=ucs2
> 
> be replaced with:
> 
> --enable-unicode=utf16
> 
> and the docs be updated to reflect more accurately the variance of the
> internal storage type.

-1. This breaks existing documentation and usage, and provides only
minimum value.

With --enable-unicode=ucs2, Python's Py_UNICODE does *not* start
supporting the full Unicode coded character set the same way it
supports UCS-2.
Individual surrogate values remain accessible, and supporting
non-BMP characters is left to the application (with the exception
of the UTF-8 codec).

Regards,
Martin



From martin at v.loewis.de  Sat May  7 15:34:27 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 15:34:27 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427C76D6.4050409@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>
	<427BF842.8060604@hathawaymix.org> <427C089F.2010808@v.loewis.de>
	<427C76D6.4050409@hathawaymix.org>
Message-ID: <427CC3E3.4090405@v.loewis.de>

Shane Hathaway wrote:
> I agree that UCS4 is needed.  There is a balancing act here; UTF-16 is
> widely used and takes less space, while UCS4 is easier to treat as an
> array of characters.  Maybe we can have both: unicode objects start with
> an internal representation in UTF-16, but get promoted automatically to
> UCS4 when you index or slice them.  The difference will not be visible
> to Python code.  A compile-time switch will not be necessary.  What do
> you think?

This breaks backwards compatibility with existing extension modules.
Applications that do PyUnicode_AsUnicode get a Py_UNICODE*, and
can use that to directly access the characters.

Regards,
Martin

From martin at v.loewis.de  Sat May  7 15:35:57 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 15:35:57 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427C86ED.7000805@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>
	<427946B9.6070500@v.loewis.de> <42794ABD.2080405@hathawaymix.org>
	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>
	<427AB26B.2040004@hathawaymix.org> <427C0078.3090806@v.loewis.de>
	<427C86ED.7000805@hathawaymix.org>
Message-ID: <427CC43D.3010603@v.loewis.de>

> Yes, but the first few steps are the same for nearly everyone, and
> people need more help taking the first few steps.

Contributions to the documentation are certainly welcome.

Regards,
Martin

From oren.tirosh at gmail.com  Sat May  7 18:56:28 2005
From: oren.tirosh at gmail.com (Oren Tirosh)
Date: Sat, 7 May 2005 19:56:28 +0300
Subject: [Python-Dev] Proposed alternative to __next__ and __exit__
Message-ID: <7168d65a05050709561b557da2@mail.gmail.com>

I suggest using a variation on the consumer interface, as described by
Fredrik Lundh at http://effbot.org/zone/consumer.htm :

.next() -- stays .next()
.__next__(arg) --  becomes .feed(arg)
.__exit__(StopIteration, ...) -- becomes .close()
.__exit__(..,..,..) -- becomes .feed(exc_info=(..,..,..))   

Extensions to effbot's original consumer interface:
1. The .feed() method may return a value 
2. Some way to raise an exception other than StopIteration inside the
generator/consumer function.  The use of a keyword argument to .feed
is just an example. I'm looking for other suggestions on this one.

No new builtins. No backward-compatibility methods and wrappers.

Yes, it would have been nicer if .next() had been called __next__() in
the first place. But at this stage I feel that the cost of "fixing" it
far outweighs any perceived benefit.

so much for "uncontroversial" parts!  :-)

  Oren
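Oren's mapping can be sketched as a thin wrapper over an ordinary generator; the Consumer class and echo generator below are hypothetical illustrations of the proposed .feed()/.close() surface, not proposed names:

```python
class Consumer(object):
    """Wrap a generator in effbot's consumer-style interface."""
    def __init__(self, gen):
        self._gen = gen
        next(self._gen)              # prime the generator to its first yield
    def feed(self, arg):             # plays the role of __next__(arg)
        return self._gen.send(arg)
    def close(self):                 # plays the role of __exit__(StopIteration, ...)
        self._gen.close()

def echo(out):
    # trivial consumer: append everything fed to it
    while True:
        out.append((yield))

buf = []
c = Consumer(echo(buf))
c.feed('a')
c.feed('b')
c.close()
print(buf)                           # ['a', 'b']
```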


On 5/6/05, Guido van Rossum <gvanrossum at gmail.com> wrote:
> [Steven Bethard]
> > So, just to make sure, if we had another PEP that contained from PEP 340[1]:
> >  * Specification: the __next__() Method
> >  * Specification: the next() Built-in Function
> >  * Specification: a Change to the 'for' Loop
> >  * Specification: the Extended 'continue' Statement
> >  * the yield-expression part of Specification: Generator Exit Handling
> > would that cover all the pieces you're concerned about?
> >
> > I'd be willing to break these off into a separate PEP if people think
> > it's a good idea.  I've seen very few complaints about any of these
> > pieces of the proposal.  If possible, I'd like to see these things
> > approved now, so that the discussion could focus more directly on the
> > block-statement issues.
> 
> I don't think it's necessary to separate this out into a separate PEP;
> that just seems busy-work. I agree these parts are orthogonal and
> uncontroversial; a counter-PEP can suffice by stating that it's not
> countering those items nor repeating them.
> 
> --
> --Guido van Rossum (home page: http://www.python.org/~guido/)
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: http://mail.python.org/mailman/options/python-dev/oren.tirosh%40gmail.com
>

From shane at hathawaymix.org  Sat May  7 20:00:43 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Sat, 07 May 2005 12:00:43 -0600
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427CC3E3.4090405@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>
	<427BF842.8060604@hathawaymix.org> <427C089F.2010808@v.loewis.de>
	<427C76D6.4050409@hathawaymix.org> <427CC3E3.4090405@v.loewis.de>
Message-ID: <427D024B.6080207@hathawaymix.org>

Martin v. Löwis wrote:
> Shane Hathaway wrote:
> 
>>I agree that UCS4 is needed.  There is a balancing act here; UTF-16 is
>>widely used and takes less space, while UCS4 is easier to treat as an
>>array of characters.  Maybe we can have both: unicode objects start with
>>an internal representation in UTF-16, but get promoted automatically to
>>UCS4 when you index or slice them.  The difference will not be visible
>>to Python code.  A compile-time switch will not be necessary.  What do
>>you think?
> 
> 
> This breaks backwards compatibility with existing extension modules.
> Applications that do PyUnicode_AsUnicode get a Py_UNICODE*, and
> can use that to directly access the characters.

Py_UNICODE would always be 32 bits wide.  PyUnicode_AsUnicode would
cause the unicode object to be promoted automatically.  Extensions that
break as a result are technically broken already, aren't they?  They're
not supposed to depend on the size of Py_UNICODE.

Shane

From martin at v.loewis.de  Sat May  7 20:33:02 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sat, 07 May 2005 20:33:02 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427D024B.6080207@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>
	<427BF842.8060604@hathawaymix.org> <427C089F.2010808@v.loewis.de>
	<427C76D6.4050409@hathawaymix.org> <427CC3E3.4090405@v.loewis.de>
	<427D024B.6080207@hathawaymix.org>
Message-ID: <427D09DE.1020705@v.loewis.de>

Shane Hathaway wrote:
> Py_UNICODE would always be 32 bits wide.

This would break PythonWin, which relies on Py_UNICODE being
the same as WCHAR_T. PythonWin is not broken; it just hasn't
been ported to UCS-4 yet (and porting it is difficult and
would cause a performance loss).

Regards,
Martin



From mal at egenix.com  Sat May  7 20:41:31 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Sat, 07 May 2005 20:41:31 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427D024B.6080207@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>
	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>
	<427CC3E3.4090405@v.loewis.de> <427D024B.6080207@hathawaymix.org>
Message-ID: <427D0BDB.6050802@egenix.com>

Shane Hathaway wrote:
> Martin v. Löwis wrote:
> 
>>Shane Hathaway wrote:
>>
>>
>>>I agree that UCS4 is needed.  There is a balancing act here; UTF-16 is
>>>widely used and takes less space, while UCS4 is easier to treat as an
>>>array of characters.  Maybe we can have both: unicode objects start with
>>>an internal representation in UTF-16, but get promoted automatically to
>>>UCS4 when you index or slice them.  The difference will not be visible
>>>to Python code.  A compile-time switch will not be necessary.  What do
>>>you think?
>>
>>
>>This breaks backwards compatibility with existing extension modules.
>>Applications that do PyUnicode_AsUnicode get a Py_UNICODE*, and
>>can use that to directly access the characters.
> 
> 
> Py_UNICODE would always be 32 bits wide.  PyUnicode_AsUnicode would
> cause the unicode object to be promoted automatically.  Extensions that
> break as a result are technically broken already, aren't they?  They're
> not supposed to depend on the size of Py_UNICODE.

-1.

You are free to compile Python with --enable-unicode=ucs4
if you prefer this setting.

I don't see any reason why we should force users to invest 4 bytes
of storage for each Unicode code point - 2 bytes work just fine
and can represent all Unicode characters that are currently
defined (using surrogates if necessary). As more and more
Unicode objects are used in a process, choosing UCS2 vs. UCS4
does make a huge difference in terms of used memory.

All this talk about UTF-16 vs. UCS-2 is not very useful
and strikes me as purely academic.

The reference to possible breakage by slicing a Unicode string and
breaking a surrogate pair is valid, but the idea of UCS-4 being
less prone to breakage is a myth:

Unicode has many code points that are meant only for composition
and don't have any standalone meaning, e.g. a combining acute
accent (U+0301), yet they are perfectly valid code points -
regardless of UCS-2 or UCS-4. It is easily possible to break
such a combining sequence using slicing, so the most
often presented argument for using UCS-4 instead of UCS-2
(+ surrogates) is rather weak if seen by daylight.
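The point is easy to demonstrate: slicing splits a combining sequence no matter how wide the storage unit is. A small sketch using the stdlib unicodedata module:

```python
import unicodedata

s = u'e\u0301'                  # LATIN SMALL LETTER E + COMBINING ACUTE ACCENT
assert len(s) == 2              # two code points, one user-perceived character
assert unicodedata.normalize('NFC', s) == u'\xe9'   # composes to U+00E9
assert s[:1] == u'e'            # slicing severs the accent -- on UCS-2 *and* UCS-4
```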

Some may now say that combining sequences are not used
all that often. However, they play a central role in Unicode
normalization (http://www.unicode.org/reports/tr15/),
which is needed whenever you want to semantically
compare Unicode objects and are

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 07 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From mal at egenix.com  Sat May  7 20:52:48 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Sat, 07 May 2005 20:52:48 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427C029D.3090907@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>
	<427C029D.3090907@v.loewis.de>
Message-ID: <427D0E80.4080502@egenix.com>

Martin v. Löwis wrote:
> M.-A. Lemburg wrote:
> 
>>Hmm, looking at the configure.in script, it seems you're right.
>>I wonder why this weird dependency on TCL was added.
> 
> 
> If Python is configured for UCS-2, and Tcl for UCS-4, then
> Tkinter would not work out of the box. Hence the weird dependency.

I believe that it would be more appropriate to adjust the _tkinter
module to adapt to the TCL Unicode size rather than
forcing the complete Python system to adapt to TCL - I don't
really see the point in an optional extension module
defining the default for the interpreter core.

At the very least, this should be a user controlled option.

Otherwise, we might as well use sizeof(wchar_t) as basis
for the default Unicode size. This at least, would be
a much more reasonable choice than whatever TCL uses.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 07 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From nbastin at opnet.com  Sat May  7 22:09:26 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Sat, 7 May 2005 16:09:26 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427CC1A1.4080206@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<ed6dac3aa136985e60159713ff1d75ac@opnet.com>	<427B1BD6.1060206@egenix.com>	<d6016a11df6560db06fc5184e6a873bc@opnet.com>	<427C0194.3000008@v.loewis.de>
	<2ee14995c7e9d4fb650fbe3844d35dcb@opnet.com>
	<427C0B13.1090502@v.loewis.de>
	<73d0c1d776eec4a5dee10b0e09990184@opnet.com>
	<427CC1A1.4080206@v.loewis.de>
Message-ID: <7d4d7235fa0855a51c733bd622e264cb@opnet.com>


On May 7, 2005, at 9:24 AM, Martin v. Löwis wrote:

> Nicholas Bastin wrote:
>> Yes, but the important question here is why would we want that?  Why
>> doesn't Python just have *one* internal representation of a Unicode
>> character?  Having more than one possible definition just creates
>> problems, and provides no value.
>
> It does provide value, there are good reasons for each setting. Which
> of the two alternatives do you consider useless?

I don't consider either alternative useless (well, I consider UCS-2 to 
be largely useless in the general case, but as we've already discussed 
here, Python isn't really UCS-2).  However, I would be a lot happier if 
we just chose *one*, and all Pythons used that one.  This would make 
extension module distribution a lot easier.

I'd prefer UTF-16, but I would be perfectly happy with UCS-4.

--
Nick


From nbastin at opnet.com  Sat May  7 22:13:55 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Sat, 7 May 2005 16:13:55 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427CC2C2.60000@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427C07C5.7060106@v.loewis.de>
	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>
	<427CC2C2.60000@v.loewis.de>
Message-ID: <a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>


On May 7, 2005, at 9:29 AM, Martin v. Löwis wrote:

> Nicholas Bastin wrote:
>> --enable-unicode=ucs2
>>
>> be replaced with:
>>
>> --enable-unicode=utf16
>>
>> and the docs be updated to reflect more accurately the variance of the
>> internal storage type.
>
> -1. This breaks existing documentation and usage, and provides only
> minimum value.

Have you been missing this conversation?  UTF-16 is *WHAT PYTHON 
CURRENTLY IMPLEMENTS*.  The current documentation is flat out wrong.  
Breaking that isn't a big problem in my book.

It provides more than minimum value - it provides the truth.


> With --enable-unicode=ucs2, Python's Py_UNICODE does *not* start
> supporting the full Unicode ccs the same way it supports UCS-2.
> Individual surrogate values remain accessible, and supporting
> non-BMP characters is left to the application (with the exception
> of the UTF-8 codec).

I can't understand what you mean by this.  My point is that if you 
configure Python to support UCS-2, then it SHOULD NOT support surrogate 
pairs.  Supporting surrogate pairs is the purview of variable-width 
encodings, and UCS-2 is not among them.

--
Nick


From eric.nieuwland at xs4all.nl  Sat May  7 22:36:01 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Sat, 7 May 2005 22:36:01 +0200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427C2548.8010907@iinet.net.au>
References: <427C2548.8010907@iinet.net.au>
Message-ID: <f18e9225b9d7fbc665008cc97f8d8283@xs4all.nl>

Nick Coghlan wrote:

> [...]
> The whole PEP draft can be found here:
> http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
> [...]
> Used as follows::
>
>          for del auto_retry(3, IOError):
>              f = urllib.urlopen("http://python.org/")
>              print f.read()

I don't know. Using 'del' in that place seems awkward to me.
Why not use the following rule:
	for [VAR in] EXPR:
		SUITE
If EXPR is an iterator, no finalisation is done.
If EXPR is not an iterator, it is created at the start and destroyed at 
the end of the loop.

--eric


From mal at egenix.com  Sat May  7 23:09:38 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Sat, 07 May 2005 23:09:38 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427C07C5.7060106@v.loewis.de>	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>	<427CC2C2.60000@v.loewis.de>
	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
Message-ID: <427D2E92.9020303@egenix.com>

Nicholas Bastin wrote:
> On May 7, 2005, at 9:29 AM, Martin v. Löwis wrote:
>>With --enable-unicode=ucs2, Python's Py_UNICODE does *not* start
>>supporting the full Unicode ccs the same way it supports UCS-2.
>>Individual surrogate values remain accessible, and supporting
>>non-BMP characters is left to the application (with the exception
>>of the UTF-8 codec).
> 
> I can't understand what you mean by this.  My point is that if you 
> configure Python to support UCS-2, then it SHOULD NOT support surrogate 
> pairs.  Supporting surrogate pairs is the purview of variable-width 
> encodings, and UCS-2 is not among them.

Surrogate pairs are only supported by the UTF-8 and UTF-16
codecs (and a few others), not the Python Unicode
implementation itself - this treats surrogate code
points just like any other Unicode code point.

This allows us to be flexible and efficient in the implementation
while guaranteeing the round-trip safety of Unicode data processed
through Python.

Your complaint about the documentation (which started this
thread) is valid.

However, I don't understand all the excitement
about Py_UNICODE: if you don't like the way this Python
typedef works, you are free to interface to Python using
any of the supported encodings using PyUnicode_Encode()
and PyUnicode_Decode(). I'm sure you'll find one that
fits your needs and if not, you can even write your
own codec and register it with Python, e.g. UTF-32
which we currently don't support ;-)
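The codec route described above is round-trip safe even for non-BMP characters; a small sketch (UTF-16 shown, since Python had no UTF-32 codec at the time):

```python
s = u'\U00012345'
data = s.encode('utf-16-be')           # big-endian, no BOM
assert len(data) == 4                  # one surrogate pair on the wire
assert data.decode('utf-16-be') == s   # lossless round-trip
```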

Please upload your doc-patch to SF.

Thanks,
-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 07 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From jcarlson at uci.edu  Sat May  7 23:08:22 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Sat, 07 May 2005 14:08:22 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <f18e9225b9d7fbc665008cc97f8d8283@xs4all.nl>
References: <427C2548.8010907@iinet.net.au>
	<f18e9225b9d7fbc665008cc97f8d8283@xs4all.nl>
Message-ID: <20050507140547.64F0.JCARLSON@uci.edu>


Eric Nieuwland <eric.nieuwland at xs4all.nl> wrote:
> 
> Nick Coghlan wrote:
> 
> > [...]
> > The whole PEP draft can be found here:
> > http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
> > [...]
> > Used as follows::
> >
> >          for del auto_retry(3, IOError):
> >              f = urllib.urlopen("http://python.org/")
> >              print f.read()
> 
> I don't know. Using 'del' in that place seems awkward to me.
> Why not use the following rule:
> 	for [VAR in] EXPR:
> 		SUITE
> If EXPR is an iterator, no finalisation is done.
> If EXPR is not an iterator, it is created at the start and destroyed at 
> the end of the loop.

You should know why that can't work.  If I pass a list, is a list an
iterator?  No, and it should be neither created nor destroyed before or
after the loop.

The discussion of why re-using 'for' is a non-starter has already been
had; re-read the 200+ messages in the thread.
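The list-vs-iterator distinction can be checked directly (a sketch on a modern Python, where the iterator method is named __next__; in the Python of the time it was plain next()):

```python
lst = [1, 2, 3]
assert not hasattr(lst, '__next__')   # a list is iterable, not an iterator
it = iter(lst)
assert hasattr(it, '__next__')        # the object iter() returns is one
assert iter(it) is it                 # an iterator is its own iterator
assert iter(lst) is not lst           # a fresh iterator is created for a list
```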

 - Josiah


From nbastin at opnet.com  Sat May  7 23:19:49 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Sat, 7 May 2005 17:19:49 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427D2E92.9020303@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427C07C5.7060106@v.loewis.de>	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>	<427CC2C2.60000@v.loewis.de>
	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
	<427D2E92.9020303@egenix.com>
Message-ID: <44da226481ded93db03f3e4c562a0bf9@opnet.com>


On May 7, 2005, at 5:09 PM, M.-A. Lemburg wrote:

> However, I don't understand all the excitement
> about Py_UNICODE: if you don't like the way this Python
> typedef works, you are free to interface to Python using
> any of the supported encodings using PyUnicode_Encode()
> and PyUnicode_Decode(). I'm sure you'll find one that
> fits your needs and if not, you can even write your
> own codec and register it with Python, e.g. UTF-32
> which we currently don't support ;-)

My concerns about Py_UNICODE are completely separate from my 
frustration that the documentation is wrong about this type.  It is 
much more important that the documentation be correct, first, and then 
we can discuss the reasons why it can be one of two values, rather than 
just a uniform value across all python implementations.  This makes 
distributing binary extension modules hard.  It has become clear to me 
that no one on this list gives a *%&^ about people attempting to 
distribute binary extension modules, or they would have cared about 
this problem, so I'll just drop that point.

However, somehow, what keeps getting lost in the mix is that 
--enable-unicode=ucs2 is a lie, and we should change what this 
configure option says.  Martin seems to disagree with me, for reasons 
that I don't understand.  I would be fine with calling the option 
utf16, or just 2 and 4, but not ucs2, as that means things that Python 
doesn't intend it to mean.

--
Nick


From nbastin at opnet.com  Sat May  7 23:24:18 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Sat, 7 May 2005 17:24:18 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427D2E92.9020303@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427C07C5.7060106@v.loewis.de>	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>	<427CC2C2.60000@v.loewis.de>
	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
	<427D2E92.9020303@egenix.com>
Message-ID: <f9c6d597427a414ee75e2cd3b683d09a@opnet.com>


On May 7, 2005, at 5:09 PM, M.-A. Lemburg wrote:

> Please upload your doc-patch to SF.

All of my proposals for what to change the documentation to have been 
shot down by Martin.  If someone has better verbiage that they'd like 
to see, I'd be perfectly happy to patch the doc.

My last suggestion was:

"This type represents the storage type which is used by Python 
internally as the basis for holding Unicode ordinals.  Extension module 
developers should make no assumptions about the size of this type on 
any given platform."

--
Nick


From mal at egenix.com  Sat May  7 23:41:18 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Sat, 07 May 2005 23:41:18 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <44da226481ded93db03f3e4c562a0bf9@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427C07C5.7060106@v.loewis.de>	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>	<427CC2C2.60000@v.loewis.de>	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>	<427D2E92.9020303@egenix.com>
	<44da226481ded93db03f3e4c562a0bf9@opnet.com>
Message-ID: <427D35FE.5040109@egenix.com>

Nicholas Bastin wrote:
> On May 7, 2005, at 5:09 PM, M.-A. Lemburg wrote:
> 
> 
>>However, I don't understand all the excitement
>>about Py_UNICODE: if you don't like the way this Python
>>typedef works, you are free to interface to Python using
>>any of the supported encodings using PyUnicode_Encode()
>>and PyUnicode_Decode(). I'm sure you'll find one that
>>fits your needs and if not, you can even write your
>>own codec and register it with Python, e.g. UTF-32
>>which we currently don't support ;-)
> 
> 
> My concerns about Py_UNICODE are completely separate from my 
> frustration that the documentation is wrong about this type.  It is 
> much more important that the documentation be correct, first, and then 
> we can discuss the reasons why it can be one of two values, rather than 
> just a uniform value across all python implementations.  This makes 
> distributing binary extension modules hard.  It has become clear to me 
> that no one on this list gives a *%&^ about people attempting to 
> distribute binary extension modules, or they would have cared about 
> this problem, so I'll just drop that point.

Actually, many of us know about the problem of having to
ship UCS2 and UCS4 builds of binary extensions and the
troubles this causes with users.

It just adds one more dimension to the number of builds
you have to make - one for the Python version, another
for the platform and in the case of Linux another one for
the Unicode width. Nowadays most Linux distros ship UCS4
builds (after RedHat started this quest), so things start
to normalize again.

> However, somehow, what keeps getting lost in the mix is that 
> --enable-unicode=ucs2 is a lie, and we should change what this 
> configure option says.  Martin seems to disagree with me, for reasons 
> that I don't understand.  I would be fine with calling the option 
> utf16, or just 2 and 4, but not ucs2, as that means things that Python 
> doesn't intend it to mean.

It's not a lie: the Unicode implementation does work with
UCS2 code points (surrogate values are Unicode code points as
well - they happen to live in a special zone of the BMP).

Only the codecs add support for surrogates in a way that
allows round-trip safety regardless of whether you used UCS2
or UCS4 as the compile-time option.
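The difference between the two builds is observable from Python code, by the way; a minimal sketch using sys.maxunicode (which reports the largest code point the build can store):

```python
import sys

# sys.maxunicode exposes the compile-time choice: 0xFFFF on a
# --enable-unicode=ucs2 ("narrow") build, 0x10FFFF on a ucs4 ("wide") build.
narrow = (sys.maxunicode == 0xFFFF)

# A non-BMP character is one code unit on a wide build, but a surrogate
# pair (two code units) on a narrow build.
ch = u"\U00010000"
expected_len = 2 if narrow else 1
print(len(ch) == expected_len)  # True on either build type
```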

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 07 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From rrr at ronadam.com  Sun May  8 00:42:51 2005
From: rrr at ronadam.com (Ron Adam)
Date: Sat, 07 May 2005 18:42:51 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <20050507140547.64F0.JCARLSON@uci.edu>
References: <427C2548.8010907@iinet.net.au>	<f18e9225b9d7fbc665008cc97f8d8283@xs4all.nl>
	<20050507140547.64F0.JCARLSON@uci.edu>
Message-ID: <427D446B.3080707@ronadam.com>


Josiah Carlson wrote:

 > You should know why that can't work.  If I pass a list, is a list an
 > iterator?  No, but it should neither be created nor destroyed before or
 > after.
 >
 > The discussion has been had in regards to why re-using 'for' is a
 > non-starter; re-read the 200+ messages in the thread.
 >
 >  - Josiah


I agree, re-using or extending 'for' doesn't seem like a good idea to me.

I wonder how much effect adding the 'for-next' and 'StopIteration' 
exception check proposed in PEP 340 will have on 'for's performance.

And why this isn't just as good:

     try:
         for value in iterator:
             BLOCK1
     except StopIteration:
         BLOCK2

Is one extra line that bad?


I think a completely separate looping or non-looping construct would be 
better for the finalization issue, and maybe it can work with classes that 
have __exit__ as well as generators.

Having it loop has the advantage of making it break out in a better 
behaved way.  So maybe Nick's PEP would work better with a different 
keyword?

Hint: 'do'

Cheers,
Ron_Adam


From bob at redivi.com  Sun May  8 01:57:40 2005
From: bob at redivi.com (Bob Ippolito)
Date: Sat, 7 May 2005 19:57:40 -0400
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <4edc17eb05050622452c13d600@mail.gmail.com>
References: <ca471dc2050506104520ed25bd@mail.gmail.com>
	<002001c55264$8065f560$11bd2c81@oemcomputer>
	<d11dcfba050506111622ea81e3@mail.gmail.com>
	<79990c6b050506125337be4615@mail.gmail.com>
	<d11dcfba05050613047e89d14b@mail.gmail.com>
	<4edc17eb05050622452c13d600@mail.gmail.com>
Message-ID: <87FF0468-3585-4985-8F33-7AFEB9AE6F79@redivi.com>

On May 7, 2005, at 1:45 AM, Michele Simionato wrote:

> On 5/6/05, Steven Bethard <steven.bethard at gmail.com> wrote:
>
>> FWIW, I'm +1 on this.  Enhanced Iterators
>>  * updates the iterator protocol to use .__next__() instead  
>> of .next()
>>  * introduces a new builtin next()
>>  * allows continue-statements to pass values to iterators
>>  * allows generators to receive values with a yield-expression
>> The first two are, I believe, how the iterator protocol probably
>> should have been in the first place.  The second two provide a simple
>> way of passing values to generators, something I got the impression
>> that the co-routiney people would like a lot.
>>
>
> Thank you for splitting the PEP. Conceptually, the "coroutine" part
> has nothing to do with blocks and it stands on its own, it is right
> to discuss it separately from the block syntax.
>
> Personally, I do not see an urgent need for the block syntax (most of
> the use case can be managed with decorators) nor for the "couroutine"
> syntax (you can already use Armin Rigo's greenlets for that).

While Armin's greenlets are really cool they're also really dangerous  
when you're integrating with C code, especially event loops and  
such.  Language support would be MUCH better.

-bob


From ncoghlan at gmail.com  Sun May  8 06:16:40 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 08 May 2005 14:16:40 +1000
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <427D446B.3080707@ronadam.com>
References: <427C2548.8010907@iinet.net.au>	<f18e9225b9d7fbc665008cc97f8d8283@xs4all.nl>	<20050507140547.64F0.JCARLSON@uci.edu>
	<427D446B.3080707@ronadam.com>
Message-ID: <427D92A8.7040508@gmail.com>

Ron Adam wrote:
> I agree, re-using or extending 'for' doesn't seem like a good idea to me.

I agree that re-using a straight 'for' loop is out, due to performance and 
compatibility issues with applying finalisation semantics to all such iterative 
loops (there's a reason the PEP redraft doesn't suggest this).

However, it makes sense to me that a "for loop with finalisation" should 
actually *be* a 'for' loop - just with some extra syntax to indicate that the 
iterator is finalised at the end of the loop.

An option other than the one in my PEP draft would be to put 'del' at the end of 
the line instead of before EXPR:

   for [VAR in] EXPR [del]:
       BLOCK1
   else:
       BLOCK2

However, as you say, 'del' isn't great for the purpose, but I was trying to 
avoid introducing yet another keyword. An obvious alternative is to use 
'finally' instead:

   for [finally] [VAR in] EXPR:
       BLOCK1
   else:
       BLOCK2

It still doesn't read all that well, but at least the word more accurately 
reflects the semantics involved.

If a new keyword is used to request iterator finalisation, it should probably 
include the word 'for' since it *is* a for loop:

   foreach [VAR in] EXPR:
       BLOCK1
   else:
       BLOCK2

That is, a basic 'for' loop wouldn't finalise the iterator, but a 'foreach' loop 
would. The other difference is that the iterator in the 'foreach' loop has the 
chance to suppress exceptions other than TerminateBlock/StopIteration (by 
refusing to be finalised in response to the exception).

The last option is to leave finalisation out of the 'for' loop syntax, and 
introduce a user defined statement to handle the finalisation:

   def consuming(iterable):
       itr = iter(iterable)
       try:
           yield itr
       finally:
           itr_exit = getattr(itr, "__exit__", None)
           if itr_exit is not None:
               try:
                   itr_exit(TerminateBlock)
               except TerminateBlock:
                   pass

   stmt consuming(iterable) as itr:
       for item in itr:
           process(item)

With this approach, iterators can't swallow exceptions. This means that 
something like auto_retry() would once again have to be written as a class:

   class auto_retry(object):
       def __init__(self, times, exc=Exception):
           self.times = xrange(times-1)
           self.exc = exc
           self.succeeded = False

       def __iter__(self):
           attempt = self.attempt
           for i in self.times:
               yield attempt()
               if self.succeeded:
                   break
           else:
               yield self.last_attempt()

       def attempt(self):
           try:
               yield
               self.succeeded = True
           except self.exc:
               pass

       def last_attempt(self):
           yield


   for attempt in auto_retry(3, IOError):
        stmt attempt:
            # Do something!
            # Including break to give up early
            # Or continue to try again without raising IOError

> I wonder how much effect adding, 'for-next' and the 'StopIteration' 
> exception check as proposed in PEP340, will have on 'for''s performance.

I'm not sure what you mean here - 'for' loops already use a StopIteration raised 
by the iterator to indicate that the loop is complete. The code you posted can't 
work, since it also intercepts a StopIteration raised in the body of the loop.
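To spell out the interception (the loop body here is made up purely for illustration):

```python
# The try/except version swallows a StopIteration raised by BLOCK1 itself,
# which a real for/else would let propagate as the bug it is.
def body(value):
    if value == 2:
        raise StopIteration  # a bug in the loop body, not end-of-iteration
    return value * 10

results = []
try:
    for value in [1, 2, 3]:
        results.append(body(value))
except StopIteration:
    results.append("done")  # runs for the wrong reason; 3 is never seen
print(results)  # [10, 'done']
```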

> I think a completely separate looping or non-looping construct would be 
> better for the finalization issue, and maybe can work with class's with 
> __exit__ as well as generators.

The PEP redraft already proposes a non-looping version as a new statement. 
However, since generators are likely to start using the new non-looping 
statement, it's important to be able to ensure timely finalisation of normal 
iterators as well. Tim and Greg's discussion the other day convinced me of this 
- that's why the idea of using 'del' to mark a finalised loop made its way into 
the draft. It can be done using a user defined statement (as shown above), but 
it would be nicer to have something built into the 'for' loop syntax to handle it.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From jcarlson at uci.edu  Sun May  8 06:21:47 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Sat, 07 May 2005 21:21:47 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427D446B.3080707@ronadam.com>
References: <20050507140547.64F0.JCARLSON@uci.edu>
	<427D446B.3080707@ronadam.com>
Message-ID: <20050507210108.64F4.JCARLSON@uci.edu>


Ron Adam <rrr at ronadam.com> wrote:
> Josiah Carlson wrote:
> 
>  > You should know why that can't work.  If I pass a list, is a list an
>  > iterator?  No, but it should neither be created nor destroyed before or
>  > after.
>  >
>  > The discussion has been had in regards to why re-using 'for' is a
>  > non-starter; re-read the 200+ messages in the thread.
>  >
>  >  - Josiah
> 
> 
> I agree, re-using or extending 'for' doesn't seem like a good idea to me.

Now that I've actually stopped to read Nick's PEP, my concern is that
'del', while being a keyword, would not be easy to spot embedded in the
rest of the line, and a large number of these 'statements' will only be
executed once, so the 'for' may confuse people.


> I wonder how much effect adding, 'for-next' and the 'StopIteration' 
> exception check as proposed in PEP340, will have on 'for''s performance.

For is already tuned to be as fast as possible, which makes sense; it is
used 4,523 times in Python 2.4.0's standard library, and easily hundreds
of thousands of times in user code.  Changing the standard for loop is
not to be done lightly.


> And why this isn't just as good:
> 
>      try:
>          for value in iterator:
>              BLOCK1
>      except StopIteration:
>          BLOCK2
> 
> Is one extra line that bad?

I don't know what line you are referring to.

> I think a completely separate looping or non-looping construct would be 
> better for the finalization issue, and maybe can work with class's with 
> __exit__ as well as generators.

From what I understand, the entire conversation has always stated that
class-based finalized objects and generator-based finalized objects will
both work, and that any proposal that works for one, but not the other,
is not sufficient.
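A rough sketch of the two styles being contrasted (the method names are just the ones this thread has been using, nothing is settled): a class with an explicit cleanup hook, and a generator whose try/finally provides the same guarantee.

```python
events = []

class Resource(object):
    def __exit__(self, *exc_info):      # hypothetical finalization hook
        events.append("class close")

def resource():
    events.append("open")
    try:
        yield "handle"
    finally:
        events.append("gen close")      # runs when the generator is closed

gen = resource()
next(gen)        # enter the generator's body
gen.close()      # finalization runs the try/finally

Resource().__exit__(None, None, None)
print(events)    # ['open', 'gen close', 'class close']
```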


> Having it loop has the advantage of making it break out in a better 
> behaved way.

What you have just typed is nonsense.  Re-type it and be explicit.


> Hint: 'do'

'do' has been previously mentioned in the thread.

 - Josiah


From exarkun at divmod.com  Sun May  8 06:32:01 2005
From: exarkun at divmod.com (Jp Calderone)
Date: Sun, 08 May 2005 04:32:01 GMT
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427D92A8.7040508@gmail.com>
Message-ID: <20050508043201.15422.945259482.divmod.quotient.25206@ohm>

On Sun, 08 May 2005 14:16:40 +1000, Nick Coghlan <ncoghlan at gmail.com> wrote:
>Ron Adam wrote:
>> I agree, re-using or extending 'for' doesn't seem like a good idea to me.
>
>I agree that re-using a straight 'for' loop is out, due to performance and
>compatibility issues with applying finalisation semantics to all such iterative
>loops (there's a reason the PEP redraft doesn't suggest this).
>
>However, it makes sense to me that a "for loop with finalisation" should
>actually *be* a 'for' loop - just with some extra syntax to indicate that the
>iterator is finalised at the end of the loop.
>
>An option other than the one in my PEP draft would be to put 'del' at the end of
>the line instead of before EXPR:
>
>   for [VAR in] EXPR [del]:
>       BLOCK1
>   else:
>       BLOCK2
>
>However, as you say, 'del' isn't great for the purpose, but I was trying to
>avoid introduding yet another keyword. An obvious alternative is to use
>'finally' instead:
>
>   for [finally] [VAR in] EXPR:
>       BLOCK1
>   else:
>       BLOCK2
>
>It still doesn't read all that well, but at least the word more accurately
>reflects the semantics involved.

  If such a construct is to be introduced, the ideal spelling would seem to be:

    for [VAR in] EXPR:
        BLOCK1
    finally:
        BLOCK2
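  Which, for comparison, would presumably just be sugar for something writable today (BLOCK1/BLOCK2 stand-ins made up for illustration):

```python
# Hand-expanded form of the proposed "for ... finally:" spelling.
def process(items):
    result = []
    try:
        for value in items:
            result.append(value * 2)   # BLOCK1
    finally:
        result.append("cleanup")       # BLOCK2: runs on completion,
                                       # break, or exception
    return result

print(process([1, 2, 3]))  # [2, 4, 6, 'cleanup']
```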

  Jp

From rrr at ronadam.com  Sun May  8 08:14:02 2005
From: rrr at ronadam.com (Ron Adam)
Date: Sun, 08 May 2005 02:14:02 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <427D92A8.7040508@gmail.com>
References: <427C2548.8010907@iinet.net.au>	<f18e9225b9d7fbc665008cc97f8d8283@xs4all.nl>	<20050507140547.64F0.JCARLSON@uci.edu>	<427D446B.3080707@ronadam.com>
	<427D92A8.7040508@gmail.com>
Message-ID: <427DAE2A.9050505@ronadam.com>

Nick Coghlan wrote:
> Ron Adam wrote:
> 
>>I agree, re-using or extending 'for' doesn't seem like a good idea to me.
> 
> 
> I agree that re-using a straight 'for' loop is out, due to performance and 
> compatibility issues with applying finalisation semantics to all such iterative 
> loops (there's a reason the PEP redraft doesn't suggest this).
> 
> However, it makes sense to me that a "for loop with finalisation" should 
> actually *be* a 'for' loop - just with some extra syntax to indicate that the 
> iterator is finalised at the end of the loop.

Question:  Is the 'for' in your case iterating over a sequence, or is it 
testing an assignment to determine whether it should continue?

The difference is slight, I admit, and both views can be said to be true 
for 'for' loops iterating over lists as well.  But maybe looking at it as 
a truth test of getting something, instead of an iteration over a 
sequence, would fit better?  When no variable to assign is supplied, the 
test would be of a private continue-stop variable in the iterator or a 
StopIteration exception.


> However, as you say, 'del' isn't great for the purpose, but I was trying to 
> avoid introduding yet another keyword. 

I didn't say that; that was Josiah.  But I agree 'del' is not good.

>An obvious alternative is to use 
> 'finally' instead:
> 
>    for [finally] [VAR in] EXPR:
>        BLOCK1
>    else:
>        BLOCK2
> 
> It still doesn't read all that well, but at least the word more accurately 
> reflects the semantics involved.

How about:

      <keyword?> [VAR from] EXPR:

Could 'from' be reused in this context?

If the keyword chosen is completely different from 'for' or 'while', 
then it doesn't need a 'del' or 'finally' as that can be part of the new 
definition of whatever keyword is chosen.

I suggested reusing 'while' a few days ago because it fit the situation 
well, but have come to the conclusion that reusing either 'for' or 'while' 
should be avoided.

So you might consider 'do'.  Guido responded with the following the other 
day:

#quote

 >[Greg Ewing]

 >> How about 'do'?
 >>
 >>    do opening(filename) as f:
 >>      ...
 >>
 >>    do locking(obj):
 >>      ...
 >>
 >>    do carefully(): #  :-)
 >>      ...

I've been thinking of that too. It's short, and in a nostalgic way
conveys that it's a loop, without making it too obvious. (Those too
young to get that should Google for do-loop.  :-)

I wonder how many folks call their action methods do() though.

#endquote

So it's not been ruled out, or followed through on, as far as I know. 
And I think it will work for both looping and non-looping situations.


> The last option is to leave finalisation out of the 'for' loop syntax, and 
> introduce a user defined statement to handle the finalisation:

Yes, leaving it out of 'for' loop syntax is good.

I don't have an opinion on user-defined statements yet, but I think 
they would be somewhat slower than a built-in block that does the same 
thing.  Performance will be an issue because these things will be nested, 
possibly quite deeply.

>>I wonder how much effect adding, 'for-next' and the 'StopIteration' 
>>exception check as proposed in PEP340, will have on 'for''s performance.
> 
> I'm not sure what you mean here - 'for' loops already use a StopIteration raised 
> by the iterator to indicate that the loop is complete. The code you posted can't 
> work, since it also intercepts a StopIteration raised in the body of the loop.

Oops, I meant to say 'for-else' above ...

The 'else' is new, isn't it?  I was thinking that putting a try-except 
around the loop does the same thing as the 'else'.  Unless I misunderstand 
its use.

But you are right, it wouldn't work if the loop catches the StopIteration.


>>I think a completely separate looping or non-looping construct would be 
>>better for the finalization issue, and maybe can work with class's with 
>>__exit__ as well as generators.
> 
> 
> The PEP redraft already proposes a non-looping version as a new statement. 
> However, since generators are likely to start using the new non-looping 
> statement, it's important to be able to ensure timely finalisation of normal 
> iterators as well. 

Huh?  I thought a normal iterator or generator doesn't need 
finalization?  If it does, then it's not normal.  Has a word been coined 
for iterators with try-finally's in them yet?

Ron_Adam  :-)



From rrr at ronadam.com  Sun May  8 08:52:33 2005
From: rrr at ronadam.com (Ron Adam)
Date: Sun, 08 May 2005 02:52:33 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <20050507210108.64F4.JCARLSON@uci.edu>
References: <20050507140547.64F0.JCARLSON@uci.edu>
	<427D446B.3080707@ronadam.com>
	<20050507210108.64F4.JCARLSON@uci.edu>
Message-ID: <427DB731.8060604@ronadam.com>

Josiah Carlson wrote:

> For is already tuned to be as fast as possible, which makes sense; it is
> used 4,523 times in Python 2.4.0's standard library, and easily hundreds
> of thousands of times in user code.  Changing the standard for loop is
> not to be done lightly.

Agreed!

>>And why this isn't just as good:
>>
>>     try:
>>         for value in iterator:
>>             BLOCK1
>>     except StopIteration:
>>         BLOCK2
>>
>>Is one extra line that bad?
> 
> 
> I don't know what line you are referring to.

I was referring to the 'try'; the 'except' would be in place of the 'else'.

Nick pointed out this wouldn't work as the 'for' already catches the 
StopIteration exception.


>>I think a completely separate looping or non-looping construct would be 
>>better for the finalization issue, and maybe can work with class's with 
>>__exit__ as well as generators.
> 
> From what I understand, the entire conversation has always stated that
> class-based finalized objects and generator-based finalized objects will
> both work, and that any proposal that works for one, but not the other,
> is not sufficient.

That's good to hear.  There seems to be some confusion as to whether or 
not 'for's will do finalizing.  So I was trying to stress that I think 
regular 'for' loops should not finalize.  They should probably raise an 
error if given an object with a try-finally in it or an __exit__ method. 
I'm not sure what the current opinion on that is, but I didn't see it 
in any of the PEPs.

>>Having it loop has the advantage of making it break out in a better 
>>behaved way.
> 
> What you have just typed is nonsense.  Re-type it and be explicit.

It was a bit brief, sorry about that. :-)

To get a non-looping block to loop, you will need to put it in a loop or 
put a loop in it.

In the first case, doing a 'break' in the block doesn't exit the loop, 
so you need to add an extra test for that.

In the second case, doing a 'break' in the loop does exit the block, but 
still runs any code after the loop.  So you may need extra handling in 
that case too.

Having a block that loops can simplify these conditions, in that a break 
always exits the body of the block and stops the loop.  A 'continue' can 
be used to skip the end of the block and start the next loop early.

And you still have the option to put the block in a loop or loops in the 
block and they will work as they do now.

I hope that clarifies what I was thinking a bit better.


Ron_Adam


From martin at v.loewis.de  Sun May  8 10:53:11 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sun, 08 May 2005 10:53:11 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427D0BDB.6050802@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>
	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>
	<427CC3E3.4090405@v.loewis.de> <427D024B.6080207@hathawaymix.org>
	<427D0BDB.6050802@egenix.com>
Message-ID: <427DD377.6040401@v.loewis.de>

M.-A. Lemburg wrote:
> Unicode has many code points that are meant only for composition
> and don't have any standalone meaning, e.g. a combining acute
> accent (U+0301), yet they are perfectly valid code points -
> regardless of UCS-2 or UCS-4. It is easily possible to break
> such a combining sequence using slicing, so the most
> often presented argument for using UCS-4 instead of UCS-2
> (+ surrogates) is rather weak if seen by daylight.

I disagree. It is not just about slicing, it is also about
searching for a character (either through the "in" operator,
or through regular expressions). If you define an SRE character
class, such a character class cannot hold a non-BMP character
in UTF-16 mode, but it can in UCS-4 mode. Consequently,
implementing XML's lexical classes (such as Name, NCName, etc.)
is much easier in UCS-4 than it is in UCS-2. In this case,
combining characters do not matter much, because the XML
spec is defined in terms of Unicode coded characters, causing
combining characters to appear as separate entities for lexical
purposes (unlike half surrogates).
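For example, a sketch of such a lexical-class pattern (the range here is made up; it only behaves as a single-character class on a wide build):

```python
import re

# On a UCS-4 build a character class can contain a non-BMP range directly;
# on a UTF-16 build the pattern source decomposes into surrogate halves.
name_start = re.compile(u"[A-Za-z\U00010000-\U0001000F]")
print(name_start.match(u"\U00010005") is not None)  # True on a wide build
```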

Regards,
Martin

From martin at v.loewis.de  Sun May  8 10:59:27 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sun, 08 May 2005 10:59:27 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427D0E80.4080502@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>
	<427C029D.3090907@v.loewis.de> <427D0E80.4080502@egenix.com>
Message-ID: <427DD4EF.4030109@v.loewis.de>

M.-A. Lemburg wrote:
> I believe that it would be more appropriate to adjust the _tkinter
> module to adapt to the TCL Unicode size rather than
> forcing the complete Python system to adapt to TCL - I don't
> really see the point in an optional extension module
> defining the default for the interpreter core.

_tkinter currently supports, for a UCS-2 Tcl, both UCS-2 and UCS-4
Python. For an UCS-4 Tcl, it requires Python also to be UCS-4.
Contributions to support the missing case are welcome.

> At the very least, this should be a user controlled option.

It is: by passing --enable-unicode=ucs2, you can force Python
to use UCS-2 even if Tcl is UCS-4, with the result that
_tkinter cannot be built anymore (and compilation fails
with an #error).

> Otherwise, we might as well use sizeof(wchar_t) as basis
> for the default Unicode size. This at least, would be
> a much more reasonable choice than whatever TCL uses.

The goal of the build process is to provide as many extension
modules as possible (given the set of headers and libraries
installed), and _tkinter is an important extension module
because IDLE depends on it.

Regards,
Martin

From martin at v.loewis.de  Sun May  8 11:04:51 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sun, 08 May 2005 11:04:51 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <7d4d7235fa0855a51c733bd622e264cb@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<ed6dac3aa136985e60159713ff1d75ac@opnet.com>	<427B1BD6.1060206@egenix.com>	<d6016a11df6560db06fc5184e6a873bc@opnet.com>	<427C0194.3000008@v.loewis.de>
	<2ee14995c7e9d4fb650fbe3844d35dcb@opnet.com>
	<427C0B13.1090502@v.loewis.de>
	<73d0c1d776eec4a5dee10b0e09990184@opnet.com>
	<427CC1A1.4080206@v.loewis.de>
	<7d4d7235fa0855a51c733bd622e264cb@opnet.com>
Message-ID: <427DD633.8040301@v.loewis.de>

Nicholas Bastin wrote:
> I don't consider either alternative useless (well, I consider UCS-2 to
> be largely useless in the general case, but as we've already discussed
> here, Python isn't really UCS-2).  However, I would be a lot happier if
> we just chose *one*, and all Python's used that one.  This would make
> extension module distribution a lot easier.

Why is that? For a binary distribution, you have to know the target
system in advance, so you also know what size the Unicode type has.
For example, on Redhat 9.x, and on Debian Sarge, /usr/bin/python
uses a UCS-4 Unicode type. As you have to build binaries specifically
for these target systems (because of dependencies on the C library,
and perhaps other libraries), building the extension module *on*
the target system will just do the right thing.

> I'd prefer UTF-16, but I would be perfectly happy with UCS-4.

-1 on the idea of dropping one alternative. They are both used
(on different systems), and people rely on both being supported.

Regards,
Martin

From jcarlson at uci.edu  Sun May  8 11:11:36 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Sun, 08 May 2005 02:11:36 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427DB731.8060604@ronadam.com>
References: <20050507210108.64F4.JCARLSON@uci.edu>
	<427DB731.8060604@ronadam.com>
Message-ID: <20050508011954.64F7.JCARLSON@uci.edu>


Ron Adam <rrr at ronadam.com> wrote:
> 
> Josiah Carlson wrote:
> >>I think a completely separate looping or non-looping construct would be 
> >>better for the finalization issue, and maybe can work with class's with 
> >>__exit__ as well as generators.
> > 
> > From what I understand, the entire conversation has always stated that
> > class-based finalized objects and generator-based finalized objects will
> > both work, and that any proposal that works for one, but not the other,
> > is not sufficient.
> 
> That's good to hear.  There seems to be some confusion as to weather or 
> not 'for's will do finalizing.  So I was trying to stress I think 
> regular 'for' loops should not finalize. They should probably give an 
> error if an object with an try-finally in them or an __exit__ method. 
> I'm not sure what the current opinion on that is.  But I didn't see it 
> in any of the PEPs.

It's not a matter of 'will they be finalized', but instead a matter of
'will they be finalized in a timely manner'.  From what I understand,
upon garbage collection any generator-based resource will be finalized
via __exit__/next(exception)/... and any class-based resource will have
its __del__ method called (as long as it is well-behaved), which can be
used to call __exit__...
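A rough sketch of the 'well-behaved class' half of that (again, __exit__/__del__ are just the names this thread has been using, not a settled protocol):

```python
log = []

class Managed(object):
    def __exit__(self, *exc_info):
        log.append("exit")
    def __del__(self):
        # Fall back to the finalization hook when the object is collected.
        self.__exit__(None, None, None)

obj = Managed()
del obj          # CPython's refcounting collects it immediately...
print(log)       # ['exit'] ...so cleanup happens in a timely manner here
```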


> >>Having it loop has the advantage of making it break out in a better 
> >>behaved way.
> > 
> > What you have just typed is nonsense.  Re-type it and be explicit.
> 
> It was a bit brief, sorry about that. :-)
> 
> To get a non-looping block to loop, you will need to put it in a loop or 
> put a loop in it.
> 
> In the first case, doing a 'break' in the block doesn't exit the loop. 
> so you need to add an extra test for that.
> 
> In the second case, doing a 'break' in the loop does exit the block, but 
> finishes any code after the loop.  So you may need an extra case in that 
> case.
> 
> Having a block that loops can simplify these conditions, in that a break 
> alway exits the body of the block and stops the loop.  A 'continue' can 
> be used to skip the end of the block and start the next loop early.
> 
> And you still have the option to put the block in a loop or loops in the 
> block and they will work as they do now.
> 
> I hope that clarifies what I was thinking a bit better.


That is the long-standing nested loops 'issue', which is not going to be
solved here, nor should it be.

I am not sure that any solution to the issue will be sufficient for
everyone involved.  The closest thing to a generic solution I can come
up with would be to allow for the labeling of for/while loops, and the
allowing of "break/continue <label>", which continues to that loop
(breaking all other loops currently nested within), or breaks that loop
(as well as all other loops currently nested within).

Perhaps something like...

while ... label 'foo':
    for ... in ... label 'goo':
        block ... label 'hoo':
            if ...:
                #equivalent to continue 'hoo'
                continue
            elif ...:
                continue 'goo'
            elif ...:
                continue 'foo'
            else:
                break 'foo'

Does this solve the nested loop problem?  Yes.  Do I like it?  Not
really; three keywords in a single for/block statement is pretty awful.
On the upside, 'label' doesn't need to be a full-on keyword (it can be a
partial keyword like 'as' still seems to be).

Enough out of me, good night,
 - Josiah


From martin at v.loewis.de  Sun May  8 11:15:45 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sun, 08 May 2005 11:15:45 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427C07C5.7060106@v.loewis.de>
	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>
	<427CC2C2.60000@v.loewis.de>
	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
Message-ID: <427DD8C1.4060109@v.loewis.de>

Nicholas Bastin wrote:
>> -1. This breaks existing documentation and usage, and provides only
>> minimum value.
> 
> 
> Have you been missing this conversation?  UTF-16 is *WHAT PYTHON
> CURRENTLY IMPLEMENTS*.  The current documentation is flat out wrong. 
> Breaking that isn't a big problem in my book.

The documentation I refer to is the one that says the equivalent of

'configure takes an option --enable-unicode, with the possible
values "ucs2", "ucs4", "yes" (equivalent to no argument),
and  "no" (equivalent to --disable-unicode)'

*THIS* documentation would break. This documentation is factually
correct at the moment (configure does indeed take these options),
and people rely on them in automatic build processes. Changing
configure options should not be taken lightly, even if they
may result from a "wrong mental model". By that rule, --with-suffix
should be renamed to --enable-suffix, --with-doc-strings to
--enable-doc-strings, and so on. However, the nitpicking that
underlies the desire to rename the option should be ignored
in favour of backwards compatibility.

Changing the documentation that goes along with the option
would be fine.
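For concreteness, the spellings under discussion (taken from the documentation quoted above; nothing here is a new option) look like this in a build script:

```shell
# Existing, documented spellings that build processes rely on:
./configure --enable-unicode=ucs2    # 2-byte Py_UNICODE (stored as UTF-16)
./configure --enable-unicode=ucs4    # 4-byte Py_UNICODE
./configure --enable-unicode         # "yes": same as passing no value
./configure --disable-unicode        # equivalent to --enable-unicode=no
```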

> It provides more than minimum value - it provides the truth.

No. It is just a command line option. It could be named
--enable-quirk=(quork|quark), and would still select UTF-16.
Command line options provide no truth - they don't even
provide statements.

>> With --enable-unicode=ucs2, Python's Py_UNICODE does *not* start
>> supporting the full Unicode CCS the same way it supports UCS-2.
> 
> I can't understand what you mean by this.  My point is that if you
> configure python to support UCS-2, then it SHOULD NOT support surrogate
> pairs.  Supporting surrogate pairs is the purview of variable-width
> encodings, and UCS-2 is not among them.

So you suggest renaming it to --enable-unicode=utf16, right?
My point is that a Unicode type with UTF-16 would correctly
support all assigned Unicode code points, which the current
2-byte implementation doesn't. So --enable-unicode=utf16 would
*not* be the truth.

Regards,
Martin

From martin at v.loewis.de  Sun May  8 11:28:01 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sun, 08 May 2005 11:28:01 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <f9c6d597427a414ee75e2cd3b683d09a@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427C07C5.7060106@v.loewis.de>	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>	<427CC2C2.60000@v.loewis.de>	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>	<427D2E92.9020303@egenix.com>
	<f9c6d597427a414ee75e2cd3b683d09a@opnet.com>
Message-ID: <427DDBA1.6070503@v.loewis.de>

Nicholas Bastin wrote:
> All of my proposals for what to change the documentation to have been 
> shot down by Martin.  If someone has better verbiage that they'd like 
> to see, I'd be perfectly happy to patch the doc.

I don't look into the specific wording - you speak English much better
than I do. What I care about is that this part of the documentation
should be complete and precise. I.e. statements like "should not make
assumptions" might be fine, as long as they are still followed by
a precise description of what the code currently does. So it should
mention that the representation can be either 2 or 4 bytes, that
the strings "ucs2" and "ucs4" can be used to select one of them,
that it is always 2 bytes on Windows, that 2 bytes means that non-BMP
characters can be represented as surrogate pairs, and so on.
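To make the surrogate-pair point concrete, here is the standard UTF-16 arithmetic (the function name is made up; the algorithm itself is the one fixed by the Unicode standard):

```python
def to_surrogate_pair(code_point):
    """Split a non-BMP code point into a UTF-16 surrogate pair.

    On a 2-byte Py_UNICODE build, a non-BMP character is stored as
    these two code units rather than as a single unit.
    """
    assert 0x10000 <= code_point <= 0x10FFFF
    offset = code_point - 0x10000
    high = 0xD800 + (offset >> 10)    # high (lead) surrogate
    low = 0xDC00 + (offset & 0x3FF)   # low (trail) surrogate
    return high, low

# U+10000, the first non-BMP code point, becomes (0xD800, 0xDC00).
```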

Regards,
Martin

From eric.nieuwland at xs4all.nl  Sun May  8 11:46:45 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Sun, 8 May 2005 11:46:45 +0200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <20050507140547.64F0.JCARLSON@uci.edu>
References: <427C2548.8010907@iinet.net.au>
	<f18e9225b9d7fbc665008cc97f8d8283@xs4all.nl>
	<20050507140547.64F0.JCARLSON@uci.edu>
Message-ID: <62c9d603ada1357bd7739d5a39f9946a@xs4all.nl>

Josiah Carlson wrote:
> Eric Nieuwland <eric.nieuwland at xs4all.nl> wrote:
>> I don't know. Using 'del' in that place seems ackward to me.
>> Why not use the following rule:
>> 	for [VAR in] EXPR:
>> 		SUITE
>> If EXPR is an iterator, no finalisation is done.
>> If EXPR is not an iterator, it is created at the start and destroyed 
>> at
>> the end of the loop.
>
> You should know why that can't work.  If I pass a list, is a list an
> iterator?  No, but it should neither be created nor destroyed before or
> after.

I suggested to create AN ITERATOR FOR THE LIST and destroy that at the 
end. The list itself remains untouched.

--eric


From ncoghlan at gmail.com  Sun May  8 13:20:44 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 08 May 2005 21:20:44 +1000
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <427DAE2A.9050505@ronadam.com>
References: <427C2548.8010907@iinet.net.au>	<f18e9225b9d7fbc665008cc97f8d8283@xs4all.nl>	<20050507140547.64F0.JCARLSON@uci.edu>	<427D446B.3080707@ronadam.com>	<427D92A8.7040508@gmail.com>
	<427DAE2A.9050505@ronadam.com>
Message-ID: <427DF60C.9020402@gmail.com>

Ron Adam wrote:
> Question:  Is the 'for' in your case iterating over a sequence? or is it 
> testing for an assignment to determine if it should continue?

Iterating over a sequence. If it's single-pass (and always single pass), you 
should use a user defined statement instead.

> The difference is slight I admit, and both views can be said to be true 
> for 'for' loops iterating over lists also.  But maybe looking at it as a 
> truth test of getting something instead of an iteration over a sequence 
> would fit better?  When a variable to assign is not supplied then the 
> test would be of a private continue-stop variable in the iterator or a 
> StopIteration exception.

No, that's the use case for user defined statements - if __enter__ raises 
TerminateBlock, then the body of the statement is not executed. What the 
for-loop part of the redrafted PEP is about is whether or not there should be an 
easy way to say "iterate over this iterator, and finalise it afterwards, 
regardless of how the iteration is ended", rather than having to use a 
try/finally block or a user defined statement for that purpose.

I think I need to reorder those two sections - introduce user-defined statements 
first, then consider whether or not to add direct finalisation support to for loops.

> If the keyword chosen is completely different from 'for' or 'while', 
> then it doesn't need a 'del' or 'finally' as that can be part of the new 
> definition of whatever keyword is chosen.

That's the technique suggested for the single-pass user defined statements. 
However, a 'for loop with finalisation' is *still fundamentally an iterative 
loop*, and the syntax should reflect that.

> So you might consider 'do', Guido responded with the following the other 
> day:
[snip quote from Guido]
> So it's not been ruled out, or followed though with, as far as I know. 
> And I think it will work for both looping and non looping situations.

The same keyword cannot be used for the looping vs non-looping construct, 
because of the effect on the semantics of break and continue statements.

The non-looping construct is the more fundamental of the two, since it can 
replace any current try/except/else/finally boilerplate, without any concern 
over whether or not the contained code uses break or continue statements. A 
looping construct alters the meanings of those statements.

>>The last option is to leave finalisation out of the 'for' loop syntax, and 
>>introduce a user defined statement to handle the finalisation:
> 
> Yes, leaving it out of 'for' loop syntax is good.
> 
> I don't have an opinion on user defined statements yet.  But I think 
> they would be somewhat slower than a built in block that does the same 
> thing.

What do you mean by 'built in block'? The user defined statements of the PEP 
redraft are simply a non-looping version of PEP 340's anonymous block statements.

> Oops, meant that to say 'for-else' above ...
> 
> The 'else' is new isn't it?  I was thinking that putting a try-except 
> around the loop does the same thing as the else.  Unless I misunderstand 
> it's use.

No, the else clause on loops is a little-known part of present-day Python - it 
executes whenever the loop terminates naturally (i.e. not via a break statement).
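For readers who haven't met the clause, it is easy to demonstrate (a small standalone illustration; `is_prime` is just an example function):

```python
def is_prime(n):
    """Classic for/else idiom: the else runs only if no break occurred."""
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            break            # found a divisor; the else clause is skipped
    else:
        return n >= 2        # loop exhausted naturally: no divisor found
    return False
```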

The only thing PEP 340 adds to for loops is the semantics to handle an argument 
to continue statements - it adds nothing to do with finalisation. My PEP 
redraft, on the other hand, suggests the introduction of a 'for loop with 
finalisation' that works fairly similarly to PEP 340's anonymous block statements.

>>The PEP redraft already proposes a non-looping version as a new statement. 
>>However, since generators are likely to start using the new non-looping 
>>statement, it's important to be able to ensure timely finalisation of normal 
>>iterators as well. 
> 
> 
> Huh?  I thought a normal iterator or generator doesn't need 
> finalization?  If it does, then it's not normal.  Has a word been coined 
> for iterators with try-finally's in them yet?

An example was posted that looked like this:

   def all_lines(filenames):
       for name in filenames:
           stmt opening(name) as f:
               for line in f:
                   yield line

This is clearly intended for use as an iterator - it returns a bunch of lines. 
However, if the iterator is not finalised promptly, then the file that provided 
the last line may be left open indefinitely.

Since the PEP makes such an iterator easy to write, it behooves the PEP to make 
it easy to use correctly. This need *can* be met by the 'consuming' user defined 
statement I posted earlier, but a more elegant solution is to be able to iterate 
over this generator normally, while also being able to ask Python to ensure the 
generator is finalised at the end of the iteration.

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From p.f.moore at gmail.com  Sun May  8 13:54:29 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Sun, 8 May 2005 12:54:29 +0100
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <20050508043201.15422.945259482.divmod.quotient.25206@ohm>
References: <427D92A8.7040508@gmail.com>
	<20050508043201.15422.945259482.divmod.quotient.25206@ohm>
Message-ID: <79990c6b0505080454335d413@mail.gmail.com>

On 5/8/05, Jp Calderone <exarkun at divmod.com> wrote:
>   If such a construct is to be introduced, the ideal spelling would seem to be:
> 
>     for [VAR in] EXPR:
>         BLOCK1
>     finally:
>         BLOCK2

While I have not been following this discussion at all (I don't have
the energy or time to follow the development of yet another proposal -
I'll wait for the PEP) this does read more naturally to me than any of
the other contortions I've seen passing by.

Paul.

From ncoghlan at gmail.com  Sun May  8 14:21:22 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 08 May 2005 22:21:22 +1000
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <79990c6b0505080454335d413@mail.gmail.com>
References: <427D92A8.7040508@gmail.com>	<20050508043201.15422.945259482.divmod.quotient.25206@ohm>
	<79990c6b0505080454335d413@mail.gmail.com>
Message-ID: <427E0442.4040707@gmail.com>

Paul Moore wrote:
> On 5/8/05, Jp Calderone <exarkun at divmod.com> wrote:
> 
>>  If such a construct is to be introduced, the ideal spelling would seem to be:
>>
>>    for [VAR in] EXPR:
>>        BLOCK1
>>    finally:
>>        BLOCK2
> 
> 
> While I have not been following this discussion at all (I don't have
> the energy or time to follow the development of yet another proposal -
> I'll wait for the PEP) this does read more naturally to me than any of
> the other contortions I've seen passing by.

Given this for loop syntax:

   for VAR in EXPR:
       BLOCK1
   else:
       BLOCK2
   finally:
       BLOCK3

And these semantics when a finally block is present:

   itr = iter(EXPR)
   exhausted = False
   try:
       while True:
           try:
               VAR = itr.next()
           except StopIteration:
               exhausted = True
               break
           BLOCK1
       if exhausted:
           BLOCK2
   finally:
       try:
           BLOCK3
       finally:
           itr_exit = getattr(itr, "__exit__", None)
           if itr_exit is not None:
               try:
                   itr_exit(TerminateBlock)
               except TerminateBlock:
                   pass

"Loop on this iterator and finalise when done" would be written:

   for item in itr:
       process(item)
   finally:
       pass
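Under current Python, the simple "iterate and finalise when done" case of those semantics can be emulated with a helper function (the iterator `__exit__` method is the draft's hypothetical protocol, not an existing one, and `iterate_and_finalise` is a made-up name):

```python
def iterate_and_finalise(iterable, body):
    """Emulate 'for item in itr: body(item)' followed by 'finally: pass'."""
    itr = iter(iterable)
    try:
        for item in itr:
            body(item)
    finally:
        # Finalise the iterator if it supports the draft's protocol;
        # __exit__ is called without arguments on a clean exit.
        itr_exit = getattr(itr, "__exit__", None)
        if itr_exit is not None:
            itr_exit()
```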

If you just want the finally clause, without finalising the iterator, you write 
it as you would now:

   try:
       for item in itr:
           process(item)
   finally:
       finalisation()

I like it - I'll update the PEP redraft to use it instead of the 'del' idea.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Sun May  8 16:10:55 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 09 May 2005 00:10:55 +1000
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <427C2548.8010907@iinet.net.au>
References: <427C2548.8010907@iinet.net.au>
Message-ID: <427E1DEF.10201@gmail.com>

Nick Coghlan wrote:
> The whole PEP draft can be found here:
> http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

I've updated this based on the feedback so far. The biggest change is that I've 
dropped the 'del' idea in favour of an optional 'finally' clause on for loops 
that finalises the iterator in addition to executing the code contained in the 
clause.

I also added additional description of the purpose of user defined statements 
(factoring out exception handling boilerplate that is not easily factored into a 
separate function), and fixed the semantics so that __exit__() is called without 
an argument when the statement exits cleanly (previously, a template could not 
tell if the statement exited cleanly or not).

I expanded on the generator section, indicating that the __exit__ method simply 
invokes next() if no exception is passed in (this makes the transaction example 
work correctly).

I updated the auto_retry example to work with the new for loop finalisation 
approach, and added an example (reading the lines from multiple named files) 
where timely iterator finalisation is needed.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From rrr at ronadam.com  Sun May  8 16:59:40 2005
From: rrr at ronadam.com (Ron Adam)
Date: Sun, 08 May 2005 10:59:40 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <20050508011954.64F7.JCARLSON@uci.edu>
References: <20050507210108.64F4.JCARLSON@uci.edu>
	<427DB731.8060604@ronadam.com>
	<20050508011954.64F7.JCARLSON@uci.edu>
Message-ID: <427E295C.3070708@ronadam.com>

Josiah Carlson wrote:
> Ron Adam <rrr at ronadam.com> wrote:
> 
>>Josiah Carlson wrote:
>>
>>>>I think a completely separate looping or non-looping construct would be 
>>>>better for the finalization issue, and maybe can work with classes with 
>>>>__exit__ as well as generators.
>>>
>>>From what I understand, the entire conversation has always stated that
>>>class-based finalized objects and generator-based finalized objects will
>>>both work, and that any proposal that works for one, but not the other,
>>>is not sufficient.
>>
>>That's good to hear.  There seems to be some confusion as to whether or 
>>not 'for' loops will do finalizing.  So I was trying to stress that I think 
>>regular 'for' loops should not finalize.  They should probably give an 
>>error if given an object with a try-finally in it or an __exit__ method. 
>>I'm not sure what the current opinion on that is, but I didn't see it 
>>in any of the PEPs.
> 
> 
> It's not a matter of 'will they be finalized', but instead a matter of
> 'will they be finalized in a timely manner'.  From what I understand,
> upon garbage collection, any generator-based resource will be finalized
> via __exit__/next(exception)/... and any class-based resource will have
> its __del__ method called (as long as it is well-behaved), which can be
> used to call __exit__...

I should have said "...should not finalize at the end of the for loop". 
With generators, you may not want them to finalize before you are done 
with them, and the same goes for classes.


>>>>Having it loop has the advantage of making it break out in a better 
>>>>behaved way.
>>>
>>>What you have just typed is nonsense.  Re-type it and be explicit.
>>
>>It was a bit brief, sorry about that. :-)
>>
>>To get a non-looping block to loop, you will need to put it in a loop or 
>>put a loop in it.
>>
>>In the first case, doing a 'break' in the block doesn't exit the loop, 
>>so you need to add an extra test for that.
>>
>>In the second case, doing a 'break' in the loop does exit the block, but 
>>finishes any code after the loop.  So you may need an extra check in that 
>>case.
>>
>>Having a block that loops can simplify these conditions, in that a break 
>>always exits the body of the block and stops the loop.  A 'continue' can 
>>be used to skip the end of the block and start the next loop early.
>>
>>And you still have the option to put the block in a loop or loops in the 
>>block and they will work as they do now.
>>
>>I hope that clarifies what I was thinking a bit better.
> 
> 
> 
> That is the long-standing nested loops 'issue', which is not going to be
> solved here, nor should it be.

We may not find a solution today, but where should it be addressed if 
not here?

I don't really see the general issue of breaking out of loops as a 
problem, but was just addressing where it overlaps with blocks and whether 
or not blocks should loop.

> I am not sure that any solution to the issue will be sufficient for
> everyone involved. 

That's the nature of programming in general, isn't it? ;-)


> The closest thing to a generic solution I can come
> up with would be to allow for the labeling of for/while loops, and the
> allowing of "break/continue <label>", which continues to that loop
> (breaking all other loops currently nested within), or breaks that loop
> (as well as all other loops currently nested within).
 >
> Perhaps something like...
> 
> while ... label 'foo':
>     for ... in ... label 'goo':
>         block ... label 'hoo':
>             if ...:
>                 #equivalent to continue 'hoo'
>                 continue
>             elif ...:
>                 continue 'goo'
>             elif ...:
>                 continue 'foo'
>             else:
>                 break 'foo'
> 
> Does this solve the nested loop problem?  Yes.  Do I like it?  Not
> really; three keywords in a single for/block statement is pretty awful.
> On the upside, 'label' doesn't need to be a full-on keyword (it can be a
> partial keyword like 'as' still seems to be).

How about this for breaking out of all loops at once.

class BreakLoop(Exception):
    """Break out of nested loops."""

try:
    for x in range(100):
        for y in range(100):
            for z in range(100):
                if x == 25 and y == 72 and z == 3:
                    raise BreakLoop
except BreakLoop:
    pass
print 'x,y,z =', x, y, z


Sometimes I would like a "try until <exception>:" for cases like this, 
where you would use "except <exception>: pass".
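A common alternative to the exception approach is to factor the nested loops into a function and let `return` act as the multi-level break (`find_target` is a made-up name for illustration):

```python
def find_target():
    """'return' exits all three loops at once, no exception needed."""
    for x in range(100):
        for y in range(100):
            for z in range(100):
                if x == 25 and y == 72 and z == 3:
                    return x, y, z
    return None

# find_target() returns (25, 72, 3)
```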

Cheers,
Ron_Adam






From shane at hathawaymix.org  Sun May  8 18:25:07 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Sun, 08 May 2005 10:25:07 -0600
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427D0BDB.6050802@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>
	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>
	<427CC3E3.4090405@v.loewis.de> <427D024B.6080207@hathawaymix.org>
	<427D0BDB.6050802@egenix.com>
Message-ID: <427E3D63.2070301@hathawaymix.org>

M.-A. Lemburg wrote:
> All this talk about UTF-16 vs. UCS-2 is not very useful
> and strikes me a purely academic.
> 
> The reference to possibly breakage by slicing a Unicode and
> breaking a surrogate pair is valid, the idea of UCS-4 being
> less prone to breakage is a myth:

Fair enough.  The original point is that the documentation is unclear
about what a Py_UNICODE[] contains.  I deduced that it contains either
UCS2 or UCS4 and implemented accordingly.  Not only did I guess wrong,
but others will probably guess wrong too.  Something in the docs needs
to spell this out.

Shane

From jcarlson at uci.edu  Sun May  8 18:32:03 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Sun, 08 May 2005 09:32:03 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427E295C.3070708@ronadam.com>
References: <20050508011954.64F7.JCARLSON@uci.edu>
	<427E295C.3070708@ronadam.com>
Message-ID: <20050508090344.64FA.JCARLSON@uci.edu>


Ron Adam <rrr at ronadam.com> wrote:
> Josiah Carlson wrote:
> > It's not a matter of 'will they be finalized', but instead a matter of
> > 'will they be finalized in a timely manner'.  From what I understand,
> > upon garbage collection, any generator-based resource will be finalized
> > via __exit__/next(exception)/... and any class-based resource will have
> > its __del__ method called (as long as it is well-behaved), which can be
> > used to call __exit__...
> 
> I should have said "...should not finalize at the end of the for loop". 
> With generators, you may not want them to finalize before you are done 
> with them, and the same goes for classes.

So you don't use them with a structure that greedily finalizes, and you
keep a reference to the object exterior to the loop.  Seems to be a
non-issue.


> > That is the long-standing nested loops 'issue', which is not going to be
> > solved here, nor should it be.
> 
> We may not find a solution today, but where should it be addressed if 
> not here?
> 
> I don't really see the general issue of breaking out of loops as a 
> problem, but was just addressing where it overlaps with blocks and whether 
> or not blocks should loop.

The argument over whether blocks should loop has, I believe, already been 
had; they should.  The various use cases involve multi-part transactions 
and such.


> > The closest thing to a generic solution I can come
> > up with would be to allow for the labeling of for/while loops, and the
> > allowing of "break/continue <label>", which continues to that loop
> > (breaking all other loops currently nested within), or breaks that loop
> > (as well as all other loops currently nested within).
>  >
> > Perhaps something like...
> > 
> > while ... label 'foo':
> >     for ... in ... label 'goo':
> >         block ... label 'hoo':
> >             if ...:
> >                 #equivalent to continue 'hoo'
> >                 continue
> >             elif ...:
> >                 continue 'goo'
> >             elif ...:
> >                 continue 'foo'
> >             else:
> >                 break 'foo'
> > 
> > Does this solve the nested loop problem?  Yes.  Do I like it?  Not
> > really; three keywords in a single for/block statement is pretty awful.
> > On the upside, 'label' doesn't need to be a full-on keyword (it can be a
> > partial keyword like 'as' still seems to be).
> 
> How about this for breaking out of all loops at once.
> 
> class BreakLoop(Exception):
>     """Break out of nested loops."""
> 
> try:
>     for x in range(100):
>         for y in range(100):
>             for z in range(100):
>                 if x == 25 and y == 72 and z == 3:
>                     raise BreakLoop
> except BreakLoop:
>     pass
> print 'x,y,z =', x, y, z
> 
> 
> Sometimes I would like a "try until <exception>:"  for cases like this 
> where you would use "except <exception>:pass".


That is a mechanism, but I like it even less than the one I offered. 
Every time one wants the ability to break out of a different loop (with
no continue equivalent), one must create another try/except clause,
further indenting the code and incurring nontrivial try/except overhead
inside nested loops.

A real solution to the problem should (in my opinion) allow breaking out
of, or continuing to, an arbitrary for/while/block.  Humorously enough,
Richie Hindle's goto/comefrom statements for Python ("not to be used in
production code") would allow 90% of the necessary behavior (though the
lack of timely finalization would probably annoy some people, but then
again, there is only so much one can expect from a module written as a
working April Fools' joke over a year ago).

 - Josiah


From jcarlson at uci.edu  Sun May  8 18:33:23 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Sun, 08 May 2005 09:33:23 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <62c9d603ada1357bd7739d5a39f9946a@xs4all.nl>
References: <20050507140547.64F0.JCARLSON@uci.edu>
	<62c9d603ada1357bd7739d5a39f9946a@xs4all.nl>
Message-ID: <20050508093239.64FD.JCARLSON@uci.edu>


Eric Nieuwland <eric.nieuwland at xs4all.nl> wrote:
> I suggested to create AN ITERATOR FOR THE LIST and destroy that at the 
> end. The list itself remains untouched.

My mistake, I did not understand your use of pronouns.


 - Josiah


From martin at v.loewis.de  Sun May  8 19:44:51 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Sun, 08 May 2005 19:44:51 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427E3D63.2070301@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>
	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>
	<427CC3E3.4090405@v.loewis.de> <427D024B.6080207@hathawaymix.org>
	<427D0BDB.6050802@egenix.com> <427E3D63.2070301@hathawaymix.org>
Message-ID: <427E5013.5020308@v.loewis.de>

Shane Hathaway wrote:
> Fair enough.  The original point is that the documentation is unclear
> about what a Py_UNICODE[] contains.  I deduced that it contains either
> UCS2 or UCS4 and implemented accordingly.  Not only did I guess wrong,
> but others will probably guess wrong too.  Something in the docs needs
> to spell this out.

Again, patches are welcome. I was opposed to Nick's proposed changes,
since they explicitly said that you are not supposed to know what
is in a Py_UNICODE. Integrating the essence of PEP 261 into the
main documentation would be a worthwhile task.

Regards,
Martin


From nbastin at opnet.com  Sun May  8 20:40:40 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Sun, 8 May 2005 14:40:40 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427DD8C1.4060109@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427C07C5.7060106@v.loewis.de>
	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>
	<427CC2C2.60000@v.loewis.de>
	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
	<427DD8C1.4060109@v.loewis.de>
Message-ID: <2bd4e2b88b94f48297ff0dcaef97a7c0@opnet.com>


On May 8, 2005, at 5:15 AM, Martin v. L?wis wrote:

> 'configure takes an option --enable-unicode, with the possible
> values "ucs2", "ucs4", "yes" (equivalent to no argument),
> and  "no" (equivalent to --disable-unicode)'
>
> *THIS* documentation would break. This documentation is factually
> correct at the moment (configure does indeed take these options),
> and people rely on them in automatic build processes. Changing
> configure options should not be taken lightly, even if they
> may result from a "wrong mental model". By that rule, --with-suffix
> should be renamed to --enable-suffix, --with-doc-strings to
> --enable-doc-strings, and so on. However, the nitpicking that
> underlies the desire to rename the option should be ignored
> in favour of backwards compatibility.
>
> Changing the documentation that goes along with the option
> would be fine.

That is exactly what I proposed originally, which you shot down.  
Please actually read the contents of my messages.  What I said was 
"change the configure option and related documentation".


>> It provides more than minimum value - it provides the truth.
>
> No. It is just a command line option. It could be named
> --enable-quirk=(quork|quark), and would still select UTF-16.
> Command line options provide no truth - they don't even
> provide statements.

Wow, what an inane way of looking at it.  I don't know what world you 
live in, but in my world, users read the configure options and suppose 
that they mean something.  In fact, they *have* to go off on their own 
to assume something, because even the documentation you refer to above 
doesn't say what happens if they choose UCS-2 or UCS-4.  A logical 
assumption would be that python would use those CEFs internally, and 
that would be incorrect.

>>> With --enable-unicode=ucs2, Python's Py_UNICODE does *not* start
>>> supporting the full Unicode ccs the same way it supports UCS-2.
>>
>> I can't understand what you mean by this.  My point is that if you
>> configure python to support UCS-2, then it SHOULD NOT support 
>> surrogate
>> pairs.  Supporting surrogate pairs is the purview of variable-width
>> encodings, and UCS-2 is not among them.
>
> So you suggest to renaming it to --enable-unicode=utf16, right?
> My point is that a Unicode type with UTF-16 would correctly
> support all assigned Unicode code points, which the current
> 2-byte implementation doesn't. So --enable-unicode=utf16 would
> *not* be the truth.

The current implementation supports the UTF-16 CEF, i.e., it supports 
a variable width encoding form capable of representing all of the 
unicode space using surrogate pairs.  Please point out a code point 
that the current 2 byte implementation does not support, either 
directly, or through the use of surrogate pairs.
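As an illustration of the encoding form under discussion (a sketch, independent of the build options being debated): a non-BMP code point such as U+11234 maps to a UTF-16 surrogate pair as follows.

```python
cp = 0x11234  # an example non-BMP code point

# UTF-16 surrogate pair computation (Unicode standard, chapter 3)
offset = cp - 0x10000
high = 0xD800 + (offset >> 10)   # high (lead) surrogate
low = 0xDC00 + (offset & 0x3FF)  # low (trail) surrogate
assert (high, low) == (0xD804, 0xDE34)

# The same pair falls out of encoding the character as UTF-16:
assert '\U00011234'.encode('utf-16-be') == b'\xd8\x04\xde\x34'
```

Note that 0xD804 is exactly the lone high surrogate that shows up later in this thread when the regex engine on a narrow build splits the character.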

--
Nick


From nbastin at opnet.com  Sun May  8 20:44:00 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Sun, 8 May 2005 14:44:00 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427DDBA1.6070503@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427C07C5.7060106@v.loewis.de>	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>	<427CC2C2.60000@v.loewis.de>	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>	<427D2E92.9020303@egenix.com>
	<f9c6d597427a414ee75e2cd3b683d09a@opnet.com>
	<427DDBA1.6070503@v.loewis.de>
Message-ID: <739269ab69ce041ca53dade76ad563f9@opnet.com>


On May 8, 2005, at 5:28 AM, Martin v. L?wis wrote:

> Nicholas Bastin wrote:
>> All of my proposals for what to change the documentation to have been
>> shot down by Martin.  If someone has better verbiage that they'd like
>> to see, I'd be perfectly happy to patch the doc.
>
> I don't look into the specific wording - you speak English much better
> than I do. What I care about is that this part of the documentation
> should be complete and precise. I.e. statements like "should not make
> assumptions" might be fine, as long as they are still followed by
> a precise description of what the code currently does. So it should
> mention that the representation can be either 2 or 4 bytes, that
> the strings "ucs2" and "ucs4" can be used to select one of them,
> that it is always 2 bytes on Windows, that 2 bytes means that non-BMP
> characters can be represented as surrogate pairs, and so on.

It's not always 2 bytes on Windows.  Users can alter the config options 
(and not unreasonably so, btw, on 64-bit windows platforms).

This goes to the issue that I think people don't understand that we 
have to assume that some users will build their own Python.  This will 
result in 2-byte Pythons on RHL9, and 4-byte Pythons on Windows, both 
of which have already been claimed in this discussion not to happen, 
which is untrue.  You can't build a binary extension module on windows 
and assume that Py_UNICODE is 2 bytes, because that's not enforced in 
any way.  The same is true for 4-byte Py_UNICODE on RHL9.

--
Nick


From nbastin at opnet.com  Sun May  8 20:46:10 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Sun, 8 May 2005 14:46:10 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427E5013.5020308@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>
	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>
	<427CC3E3.4090405@v.loewis.de> <427D024B.6080207@hathawaymix.org>
	<427D0BDB.6050802@egenix.com> <427E3D63.2070301@hathawaymix.org>
	<427E5013.5020308@v.loewis.de>
Message-ID: <bea778d4690a6e73455f7908fa009c8c@opnet.com>


On May 8, 2005, at 1:44 PM, Martin v. L?wis wrote:

> Shane Hathaway wrote:
>> Fair enough.  The original point is that the documentation is unclear
>> about what a Py_UNICODE[] contains.  I deduced that it contains either
>> UCS2 or UCS4 and implemented accordingly.  Not only did I guess wrong,
>> but others will probably guess wrong too.  Something in the docs needs
>> to spell this out.
>
> Again, patches are welcome. I was opposed to Nick's proposed changes,
> since they explicitly said that you are not supposed to know what
> is in a Py_UNICODE. Integrating the essence of PEP 261 into the
> main documentation would be a worthwhile task.

You can't possibly assume you know specifically what's in a Py_UNICODE 
in any given python installation.  If someone thinks this statement is 
untrue, please explain why.

I realize you might not *want* that to be true, but it is.  Users are 
free to configure their python however they desire, and if that means 
--enable-unicode=ucs2 on RH9, then that is perfectly valid.

--
Nick


From rrr at ronadam.com  Sun May  8 21:23:50 2005
From: rrr at ronadam.com (Ron Adam)
Date: Sun, 08 May 2005 15:23:50 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <427DF60C.9020402@gmail.com>
References: <427C2548.8010907@iinet.net.au>	<f18e9225b9d7fbc665008cc97f8d8283@xs4all.nl>	<20050507140547.64F0.JCARLSON@uci.edu>	<427D446B.3080707@ronadam.com>	<427D92A8.7040508@gmail.com>	<427DAE2A.9050505@ronadam.com>
	<427DF60C.9020402@gmail.com>
Message-ID: <427E6746.6070707@ronadam.com>

Nick Coghlan wrote:


> Iterating over a sequence. If it's single-pass (and always single pass), you 
> should use a user defined statement instead.

> That's the technique suggested for the single-pass user defined statements. 
> However, a 'for loop with finalisation' is *still fundamentally an iterative 
> loop*, and the syntax should reflect that.

<in response to 'do'>
> The same keyword cannot be used for the looping vs non-looping construct, 
> because of the effect on the semantics of break and continue statements.

I disagree with this. I think 'do' would work very well for both 
single-pass and multiple-pass blocks.

In this example 'do' evaluates as True until the generator ends without 
returning a value:

def open_file(name,mode):
     f = open(name,mode)
     try:
         yield f
     finally:
         f.close()

do f from open_file(name,mode):
    for line in f:
        print line.rstrip()

On the first try, it gets f, so the do expression evaluates as True and 
the BLOCK is run.

On the second try, instead of getting a value, the finally suite is 
executed and the generator ends, causing the do expression to evaluate 
as False.

If a continue is used, it just skips the rest of the 'do' body, and 
then whether or not to loop is determined by whether the 'do' 
expression evaluates as True.

A break skips the rest of the 'do' body and executes the generator's 
finally clause.

This works the same in both single pass and multi pass situations.

The difference is that by using a truth test instead of iterating, it 
better represents what is happening and opens up a few options.

There's also the possibility to use conditional looping based on the 
value returned from the generator.

do VAR from EXPR if VAR==CONST:
    BLOCK

This is a bit verbose, but it reads well. :-)

But that is really just a short cut for:

do VAR from EXPR:
     if VAR != CONST:
         break
     BLOCK


The syntax might be:

    do ([VAR from] EXPR1) | (VAR from EXPR1 if EXPR2): BODY


>>I don't have an opinion on user defined statements yet.  But I think 
>>they would be somewhat slower than a built in block that does the same 
>>thing.
>  
> What do you mean by 'built in block'? The user defined statements of the PEP 
> redraft are simply a non-looping version of PEP 340's anonymous block statements.

Ok, my mistake, I thought you were suggesting the more general user 
defined statements suggested elsewhere.


> No, the else clause on loops is a little known part of present day Python - it 
> executes whenever the loop terminates naturally (i.e. not via a break statement).

Hmm... ok, and the opposite of what I expected.  No wonder it's a 
little known part.


> My PEP redraft, on the other hand, suggests the introduction of a 'for loop with 
> finalisation' that works fairly similarly to PEP 340's anonymous block statements.

Here is my current thinking.  It will be better to have three separate 
loops with three identifiable names, and have each work in distinctly 
different ways.  That simplifies teaching, using, and reading the 
resulting code. IMHO.

    1.  For-loops: Fast efficient list iteration. No changes.

    2.  While-loops: Fast efficient truth test based loop. No changes.

    3.  Do-loops: A generator-based loop with finalization.  This could 
be both single and multiple pass.  The difference is determined by 
whether or not the generator loops over the yield statement.


I think a good test is the retry example in the PEP.  A solution that 
can represent that clearly and concisely would be a good choice.

Maybe this could be made to work:

def auto_retry(n, exc):
     while n>0:
         try:
             yield True
             n = 0
         except exc:
             n -= 1

do auto_retry(3, IOError):
     f = urllib.urlopen("http://python.org/")
     print f.read()

The ability to propagate the exception back to the generator is what's 
important here.

The while version of this nearly works, but is missing the exception 
propagation back to the generator, the ability to pass back through the 
yield, and finalization if the outside while loop is broken before the 
generator finishes.

def auto_retry(n, exc):
     while n>1:
         try:
             yield True
             break
         except exc:
             n -= 1
     # finalize here
     yield None

import urllib
ar = auto_retry(3, IOError)
while ar.next():
     f = urllib.urlopen("http://python.org/")
     print f.read()
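The exception propagation Ron identifies as missing is what generator.throw() provides; a modern-Python sketch of a driver loop that hands failures back into the generator (flaky() and the driver structure are hypothetical illustrations, not part of any proposal here):

```python
def auto_retry(n, exc):
    """Yield True while retries remain; a thrown exc consumes one retry."""
    while n > 0:
        try:
            yield True
            return                  # the body succeeded: stop iterating
        except exc:
            n -= 1
    raise exc("all retries failed")

calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise IOError("transient failure")
    return "ok"

ar = auto_retry(3, IOError)
next(ar)                            # prime the generator
result, active = None, True
while active:
    try:
        result = flaky()
        try:
            next(ar)                # report success to the generator
        except StopIteration:
            active = False          # generator finished cleanly
    except IOError as e:
        try:
            ar.throw(e)             # propagate the failure back through yield
        except StopIteration:
            active = False

assert result == "ok" and len(calls) == 3
```

The throw() call resumes the generator with the exception raised at the yield, so the generator's own except clause decides whether to retry, which is the control flow PEP 340's block statement was meant to package.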

Although changing 'while' shouldn't be done. I think using 'do' for 
generator based loops would be good.

This isn't that different from PEP 340 I think.  Maybe it's just coming 
to the same conclusion from a different perspective. <shrug> :-)

Cheers, Ron





From eric.nieuwland at xs4all.nl  Sun May  8 21:39:09 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Sun, 8 May 2005 21:39:09 +0200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <20050508093239.64FD.JCARLSON@uci.edu>
References: <20050507140547.64F0.JCARLSON@uci.edu>
	<62c9d603ada1357bd7739d5a39f9946a@xs4all.nl>
	<20050508093239.64FD.JCARLSON@uci.edu>
Message-ID: <0b4428d924a9ea5e6209db4f1745ff93@xs4all.nl>

Josiah Carlson wrote:
> Eric Nieuwland <eric.nieuwland at xs4all.nl> wrote:
>> I suggested to create AN ITERATOR FOR THE LIST and destroy that at the
>> end. The list itself remains untouched.
>
> My mistake, I did not understand your use of pronouns.

And, rereading my post, I used an ambiguous reference.
My bad as well.

--eric


From eric.nieuwland at xs4all.nl  Sun May  8 21:56:01 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Sun, 8 May 2005 21:56:01 +0200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <20050508090344.64FA.JCARLSON@uci.edu>
References: <20050508011954.64F7.JCARLSON@uci.edu>
	<427E295C.3070708@ronadam.com>
	<20050508090344.64FA.JCARLSON@uci.edu>
Message-ID: <7b38962e2a0fde925b932e366f24ab1f@xs4all.nl>

Josiah Carlson wrote:
> The argument over whether blocks should loop, I believe has been had;
> they should.  The various use cases involve multi-part transactions and
> such.

Then it is not so much looping but more pushing forward the state of 
the block's life-cycle?
This might be a good moment to consider life-cycle support a la PROCOL.

--eric


From mwh at python.net  Sun May  8 21:33:33 2005
From: mwh at python.net (Michael Hudson)
Date: Sun, 08 May 2005 20:33:33 +0100
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <20050508043201.15422.945259482.divmod.quotient.25206@ohm> (Jp
	Calderone's message of "Sun, 08 May 2005 04:32:01 GMT")
References: <20050508043201.15422.945259482.divmod.quotient.25206@ohm>
Message-ID: <2moeblv7g2.fsf@starship.python.net>

Jp Calderone <exarkun at divmod.com> writes:

>   If such a construct is to be introduced, the ideal spelling would seem to be:
>
>     for [VAR in] EXPR:
>         BLOCK1
>     finally:
>         BLOCK2

Does this mean that adding 

    finally:
        pass

to a for block would make the for loop behave differently?

Cheers,
mwh

-- 
  I really hope there's a catastrophic bug in some future e-mail
  program where if you try and send an attachment it cancels your
  ISP account, deletes your harddrive, and pisses in your coffee
                                                         -- Adam Rixey

From ncoghlan at gmail.com  Sun May  8 22:54:56 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 09 May 2005 06:54:56 +1000
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <20050508090344.64FA.JCARLSON@uci.edu>
References: <20050508011954.64F7.JCARLSON@uci.edu>	<427E295C.3070708@ronadam.com>
	<20050508090344.64FA.JCARLSON@uci.edu>
Message-ID: <427E7CA0.7080308@gmail.com>

Josiah Carlson wrote:
> The argument over whether blocks should loop, I believe has been had;
> they should.  The various use cases involve multi-part transactions and
> such.

The number of good use cases for a looping block statement currently stands at 
exactly 1 (auto_retry). Every other use case suggested (locking, opening, 
suppressing, etc) involves factoring out try statement boiler plate that is far 
easier to comprehend with a single pass user defined statement. A single pass 
user defined statement allows all such code to be factored safely, even if the 
main clause of the try statement uses break or continue statements.

The insanity of an inherently looping block statement is shown by the massive 
semantic differences between the following two pieces of code under PEP 340:

   block locking(the_lock):
       for item in items:
           if handle(item):
               break

   for item in items:
       block locking(the_lock):
           if handle(item):
               break

With a non-looping user defined statement, you get the semantics you would 
expect for the latter case (i.e. the for loop is still terminated after an item 
is handled, whereas that won't happen under PEP 340).

For the one good use case for a user defined loop (auto_retry), I initially 
suggested in my redraft that there be a way of denoting that a given for loop 
gives the iterator the opportunity to intercept exceptions raised in the body of 
the loop (like the PEP 340 block statement). You convinced me that was a bad 
idea, and I switched to a simple iterator finalisation clause in version 1.2.

Even with that simplified approach though, *using* auto_retry is still very easy:

   for attempt in auto_retry(3, IOError):
       stmt attempt:
           do_something()

It's a little trickier to write auto_retry itself, since you can't easily use a 
generator anymore, but it still isn't that hard, and the separation of concerns 
(between iteration, and the customised control flow in response to exceptions) 
makes it very easy to grasp how it works.

>>>The closest thing to a generic solution I can come
>>>up with would be to allow for the labeling of for/while loops, and the
>>>allowing of "break/continue <label>", which continues to that loop
>>>(breaking all other loops currently nested within), or breaks that loop
>>>(as well as all other loops currently nested within).

Or, we simply have user defined statements which are not themselves loops, and 
use them to create named blocks:

   def block(name):
       try:
           yield
       except TerminateBlock, ex:
           if not ex.args or ex.args[0] != name:
               raise

stmt block('foo'):
     while condition():
         stmt block('goo'):
             for ... in ...:
                 while other_case():
                     stmt block('hoo'):
                         if ...:
                             # Continue the inner while loop
                             continue
                         if ...:
                             # Exit the inner while loop
                             raise TerminateBlock, 'hoo'
                         if ...:
                             # Exit the for loop
                             raise TerminateBlock, 'goo'
                         # Exit the outer while loop
                         raise TerminateBlock, 'foo'

This has the benefit that an arbitrary block of code can be named, and a named 
TerminateBlock used to exit it.
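The core of the exception-based escape already works in plain Python today; a minimal runnable sketch of the idea (the names are mine, not part of the PEP):

```python
class TerminateBlock(Exception):
    """Carries a payload out of a named block of nested loops."""

def find_zero(grid):
    """Return the (row, col) of the first 0 in grid, or None."""
    try:
        for x, row in enumerate(grid):
            for y, value in enumerate(row):
                if value == 0:
                    # Exits both loops at once
                    raise TerminateBlock((x, y))
    except TerminateBlock as ex:
        return ex.args[0]
    return None

assert find_zero([[1, 1], [1, 0]]) == (1, 1)
assert find_zero([[1], [1]]) is None
```

What the user defined statement adds is packaging the try/except boilerplate behind a single keyword, so the block name appears only at the raise site and the statement header.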

> That is a mechanism, but I like it even less than the one I offered. 
> Every time that one wants to offer themselves the ability to break out
> of a different loop (no continue here), one must create another
> try/except clause, further indenting, and causing nontrivial try/except
> overhead inside nested loops.

Ah well, that criticism applies to my suggestion, too. However, I suspect any 
such implementation is going to need to use exceptions for the guts of the flow 
control, even if that use isn't visible to the programmer.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From rrr at ronadam.com  Mon May  9 03:54:44 2005
From: rrr at ronadam.com (Ron Adam)
Date: Sun, 08 May 2005 21:54:44 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <20050508090344.64FA.JCARLSON@uci.edu>
References: <20050508011954.64F7.JCARLSON@uci.edu>
	<427E295C.3070708@ronadam.com>
	<20050508090344.64FA.JCARLSON@uci.edu>
Message-ID: <427EC2E4.8020205@ronadam.com>

Josiah Carlson wrote:

> Ron Adam <rrr at ronadam.com> wrote:

>>I should have said  "...should not finalize at the end of the for loop". 
>>  With generators, you may not want them to finalize before you are done 
>>with them, and the same with classes.
> 
> 
> So you don't use them with a structure that greedily finalizes, and you
> keep a reference to the object exterior to the loop.  Seems to be a
> non-issue.

Yes, it should be a non issue.


> The argument over whether blocks should loop, I believe has been had;
> they should.  The various use cases involve multi-part transactions and
> such.

I think so now too.  I had thought, as Nick did earlier this week, that 
the non-looping version was cleaner, but changed my mind when I realized 
that looping blocks could be made to work for those in a simple and 
understandable way.

>>try:
>>     for x in range(100):
>>         for y in range(100):
>>             for z in range(100):
>>                 if x == 25 and y==72 and z==3:
>>                     raise BreakLoop
>>
>>except BreakLoop: pass
>>print 'x,y,z =', x,y,z

> That is a mechanism, but I like it even less than the one I offered. 
> Every time that one wants to offer themselves the ability to break out
> of a different loop (no continue here), one must create another
> try/except clause, further indenting, and causing nontrivial try/except
> overhead inside nested loops.
> 
> A real solution to the problem should (in my opinion) allow the breaking
> of or continuing to an arbitrary for/while/block.  Humorously enough,
> Richie Hindle's goto/comefrom statements for Python ("not to be used in
> production code") would allow 90% of the necessary behavior (though the
> lack of timely finalization would probably annoy some people, but then
> again, there is only so much one can expect from a module written as a
> working April Fools joke over a year ago).
> 
>  - Josiah

I think maybe another alternative is a break buffer or queue, where you 
push a 'break' onto the buffer and then execute a 'break' to break the 
current loop.  The 'break' in the buffer then breaks the next loop out as 
soon as the current loop exits, etc.

for x in range(100):
     for y in range(100):
         for z in range(100):
            if x == 25 and y==72 and z==3:
               push_loop(Break,Break)  # will break both parent loops
               break                   # break current loop

If push_loop(...) could take a NoBreak, then you can selectively skip 
outer breaks by how you sequence them.

push_loop(NoBreak, Break) would not break the y loop, but will break the 
x loop above.  Can you think of a use case for something like that?


Pushing 'Continues' might also work:

for x in range(100):
     for y in range(100):
         if x == 25 and y==72:
             push_loop(Continue)  # will skip rest of parent's body
             break                # break current loop
         #code2
     #code1

This will break the 'y' loop and skip code2, then continue the 'x' loop 
skipping code block 1.

Using a stack for breaks and continues isn't too different than using a 
stack for exceptions I think.

Also by making it a function call instead of a command you can have a 
function return a Break or Continue object, or None,

for x in range(100):
    for y in range(100):
       if y == testval:
           push_loop(looptest(x,y))  # break x loop depending on x,y
           break

None values returned would need to be discarded, I think, for this to 
work, so something else would be needed to skip a level.

It needs some polish I think.  ;-)

Cheers,
Ron_Adam



From martin at v.loewis.de  Mon May  9 06:59:59 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Mon, 09 May 2005 06:59:59 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <2bd4e2b88b94f48297ff0dcaef97a7c0@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427C07C5.7060106@v.loewis.de>
	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>
	<427CC2C2.60000@v.loewis.de>
	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
	<427DD8C1.4060109@v.loewis.de>
	<2bd4e2b88b94f48297ff0dcaef97a7c0@opnet.com>
Message-ID: <427EEE4F.40405@v.loewis.de>

Nicholas Bastin wrote:
>> Changing the documentation that goes along with the option
>> would be fine.
> 
> 
> That is exactly what I proposed originally, which you shot down.  Please
> actually read the contents of my messages.  What I said was "change the
> configure option and related documentation".

What I mean is "change just the documentation, do not change the
configure option". This seems to be different from your proposal,
which I understand as "change both the configure option and the
documentation".

> Wow, what an inane way of looking at it.  I don't know what world you
> live in, but in my world, users read the configure options and suppose
> that they mean something.  In fact, they *have* to go off on their own
> to assume something, because even the documentation you refer to above
> doesn't say what happens if they choose UCS-2 or UCS-4.  A logical
> assumption would be that python would use those CEFs internally, and
> that would be incorrect.

Certainly. That's why the documentation should be improved. Changing
the option breaks existing packaging systems, and should not be done
lightly.

> The current implementation supports the UTF-16 CEF.  i.e., it supports a
> variable width encoding form capable of representing all of the unicode
> space using surrogate pairs.  Please point out a code point that the
> current 2 byte implementation does not support, either directly, or
> through the use of surrogate pairs.

Try to match regular expression classes for non-BMP characters:

>>> re.match(u"[\u1234]",u"\u1234").group()
u'\u1234'

works fine, but

>>> re.match(u"[\U00011234]",u"\U00011234").group()
u'\ud804'

gives strange results.

Regards,
Martin

From martin at v.loewis.de  Mon May  9 07:06:42 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Mon, 09 May 2005 07:06:42 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <739269ab69ce041ca53dade76ad563f9@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427C07C5.7060106@v.loewis.de>	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>	<427CC2C2.60000@v.loewis.de>	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>	<427D2E92.9020303@egenix.com>
	<f9c6d597427a414ee75e2cd3b683d09a@opnet.com>
	<427DDBA1.6070503@v.loewis.de>
	<739269ab69ce041ca53dade76ad563f9@opnet.com>
Message-ID: <427EEFE2.9040807@v.loewis.de>

Nicholas Bastin wrote:
> It's not always 2 bytes on Windows.  Users can alter the config options
> (and not unreasonably so, btw, on 64-bit windows platforms).

Did you try that? I'm not sure it even builds when you do so, but if it
does, you will lose the "mbcs" codec, and the ability to use Unicode
strings as file names. Without the "mbcs" codec, I would expect that
quite a lot of the Unicode stuff breaks.

> You can't build a binary extension module on windows and
> assume that Py_UNICODE is 2 bytes, because that's not enforced in any
> way.  The same is true for 4-byte Py_UNICODE on RHL9.

Depends on how much force you want to see. That the official pydotorg
Windows installer python24.dll uses a 2-byte Unicode, and that a lot
of things break if you change Py_UNICODE to four bytes on Windows
(including PythonWin) is a pretty strong guarantee that you won't
see a Windows Python build with UCS-4 for quite some time.

Regards,
Martin

From michele.simionato at gmail.com  Mon May  9 07:08:54 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Mon, 9 May 2005 01:08:54 -0400
Subject: [Python-Dev] The decorator module
In-Reply-To: <5.1.1.6.0.20050506123920.020644e0@mail.telecommunity.com>
References: <fb6fbf560505060730789906e2@mail.gmail.com>
	<4edc17eb0505060741635ecde8@mail.gmail.com>
	<ca471dc205050607553b4bafed@mail.gmail.com>
	<5.1.1.6.0.20050506123920.020644e0@mail.telecommunity.com>
Message-ID: <4edc17eb05050822084bb6a34c@mail.gmail.com>

On 5/6/05, Phillip J. Eby <pje at telecommunity.com> wrote:
> In this case, the informally-discussed proposal is to add a mutable
> __signature__ to functions, and have it be used by inspect.getargspec(), so
> that decorators can copy __signature__ from the decoratee to the decorated
> function.

Is there in the plans any facility to copy functions? Currently I am doing

def copyfunc(func):
    "Creates an independent copy of a function."
    c = func.func_code
    nc = new.code(c.co_argcount, c.co_nlocals, c.co_stacksize, c.co_flags,
                  c.co_code, c.co_consts, c.co_names, c.co_varnames,
                  c.co_filename, c.co_name, c.co_firstlineno,
                  c.co_lnotab, c.co_freevars, c.co_cellvars)
    return new.function(nc, func.func_globals, func.func_name,
                        func.func_defaults, func.func_closure)
 
and I *hate* it!
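For what it's worth, the copy can be written without reconstructing the code object, by building the function object directly; a modern-Python sketch (the helper name is mine, not an existing facility):

```python
import functools
import types

def copyfunc(func):
    """Create an independent copy of a function (sketch)."""
    new_func = types.FunctionType(func.__code__, func.__globals__,
                                  func.__name__, func.__defaults__,
                                  func.__closure__)
    # Carry over docstring, annotations, and other metadata,
    # plus keyword-only defaults, which FunctionType doesn't take.
    new_func = functools.update_wrapper(new_func, func)
    new_func.__kwdefaults__ = func.__kwdefaults__
    return new_func

def f(x, y=2):
    return x + y

g = copyfunc(f)
assert g(3) == 5
assert g is not f
```

The copy shares the code object with the original, which is fine because code objects are immutable; only the mutable attributes need to be distinct.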

I have updated my module to version 0.2, with an improved discussion
of decorators in multithreaded programming ("locked", "threaded",
"deferred"): http://www.phyast.pitt.edu/~micheles/python/decorator.zip


        Michele Simionato

From martin at v.loewis.de  Mon May  9 07:12:24 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Mon, 09 May 2005 07:12:24 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <bea778d4690a6e73455f7908fa009c8c@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>
	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>
	<427CC3E3.4090405@v.loewis.de> <427D024B.6080207@hathawaymix.org>
	<427D0BDB.6050802@egenix.com> <427E3D63.2070301@hathawaymix.org>
	<427E5013.5020308@v.loewis.de>
	<bea778d4690a6e73455f7908fa009c8c@opnet.com>
Message-ID: <427EF138.8060607@v.loewis.de>

Nicholas Bastin wrote:
>> Again, patches are welcome. I was opposed to Nick's proposed changes,
>> since they explicitly said that you are not supposed to know what
>> is in a Py_UNICODE. Integrating the essence of PEP 261 into the
>> main documentation would be a worthwhile task.
> 
> 
> You can't possibly assume you know specifically what's in a Py_UNICODE
> in any given python installation.  If someone thinks this statement is
> untrue, please explain why.

This is a different issue. Between saying "we don't know what
installation xyz uses" and saying "we cannot say anything" is a wide
range of things that you can truthfully say. Like "it can be either
two bytes or four bytes" (but not one or three bytes), and so on.

Also, for a given installation, you can find out by looking at
sys.maxunicode from Python, or at Py_UNICODE_SIZE from C.
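
A minimal sketch of that check from the Python side. `sys.maxunicode` is the real attribute; note that on modern interpreters (after PEP 393 in 3.3) it is always 0x10FFFF, but on the builds under discussion it distinguished the two cases:

```python
import sys

# sys.maxunicode reveals the build's Unicode width: 0xFFFF on a
# narrow (UCS-2) build, 0x10FFFF on a wide (UCS-4) build.  From C,
# Py_UNICODE_SIZE gives the same information as 2 or 4.
if sys.maxunicode == 0xFFFF:
    unicode_width = "narrow build: two bytes per Py_UNICODE"
else:
    unicode_width = "wide build: four bytes per Py_UNICODE"
```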

> I realize you might not *want* that to be true, but it is.  Users are
> free to configure their python however they desire, and if that means
> --enable-unicode=ucs2 on RH9, then that is perfectly valid.

Sure they can. Of course, that will mean they don't get a working
_tkinter, unless they rebuild Tcl as well. Nevertheless, it is indeed
likely that people do that. So if you want to support them, you
need to distribute two versions of your binary module, or give
them source code.

Regards,
Martin

From jcarlson at uci.edu  Mon May  9 07:56:49 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Sun, 08 May 2005 22:56:49 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427E6746.6070707@ronadam.com>
References: <427DF60C.9020402@gmail.com> <427E6746.6070707@ronadam.com>
Message-ID: <20050508225019.6506.JCARLSON@uci.edu>


Ron Adam <rrr at ronadam.com> wrote:
> There's also the possibility to use conditional looping based on the 
> value returned from the generator.
> 
> do VAR from EXPR if VAR==CONST:
>     BLOCK
> 
> This is a bit verbose, but it reads well. :-)

Reading well or not, this is not really an option for the same reasons
why...

  for VAR in EXPR1 if EXPR2:
or
  for VAR in EXPR1 while EXPR2:

are not options.  Keep it simple.
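
For what it's worth, the rejected `while` spelling is already expressible with itertools (`takewhile` has been in the standard library since 2.3), which is part of why new syntax is a hard sell:

```python
from itertools import takewhile

# 'for v in values while v < 5' spelled with existing tools: consume
# items only while the predicate holds, then stop iterating entirely.
values = [1, 2, 3, 7, 2]
under_five = list(takewhile(lambda v: v < 5, values))
# Iteration stops at 7 even though another 2 follows it.
```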


>     3.  Do-loops: A generator-based loop with finalization:  This could 
> be both single and multiple pass.  The difference is determined by 
> whether or not the generator loops around the yield statement.

Offering only generator-based finalization loops is, as I understand it,
not an option.


 - Josiah


From jcarlson at uci.edu  Mon May  9 08:31:05 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Sun, 08 May 2005 23:31:05 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427E7CA0.7080308@gmail.com>
References: <20050508090344.64FA.JCARLSON@uci.edu> <427E7CA0.7080308@gmail.com>
Message-ID: <20050508230253.6509.JCARLSON@uci.edu>


Nick Coghlan <ncoghlan at gmail.com> wrote:
> Josiah Carlson wrote:
> > The argument over whether blocks should loop, I believe has been had;
> > they should.  The various use cases involve multi-part transactions and
> > such.

[snip looping block discussion]

> For the one good use case for a user defined loop (auto_retry), I initially 
> suggested in my redraft that there be a way of denoting that a given for loop 
> gives the iterator the opportunity to intercept exceptions raised in the body of 
> the loop (like the PEP 340 block statement). You convinced me that was a bad 
> idea, and I switched to a simple iterator finalisation clause in version 1.2.

Well then, I guess you have re-convinced me that the block statement
probably shouldn't loop.

> Even with that simplified approach though, *using* auto_retry is still very easy:
> 
>    for attempt in auto_retry(3, IOError):
>        stmt attempt:
>            do_something()
> 
> It's a little trickier to write auto_retry itself, since you can't easily use a 
> generator anymore, but it still isn't that hard, and the separation of concerns 
> (between iteration, and the customised control flow in response to exceptions) 
> makes it very easy to grasp how it works.

Great.  Now all we need is a module with a handful of finalization
generators, with all of the obvious ones already implemented.
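
The draft's `stmt` keyword never landed, but the shape of a non-generator auto_retry is easy to sketch, with today's `with` statement standing in for the user-defined suite. The names `auto_retry` and `attempt` come from the quoted example; everything else here is illustrative, not from the PEP:

```python
class _Attempt:
    """One pass through the retry loop; suppresses the named exception
    while retries remain (illustrative stand-in for the 'stmt' suite)."""
    def __init__(self, owner):
        self.owner = owner
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:
            self.owner.succeeded = True
            return False
        # Swallow the expected exception only while attempts remain.
        return (issubclass(exc_type, self.owner.exc_type)
                and self.owner.remaining > 0)

class auto_retry:
    """A plain iterator class (no generator) yielding attempt objects."""
    def __init__(self, times, exc_type):
        self.remaining = times
        self.exc_type = exc_type
        self.succeeded = False
    def __iter__(self):
        return self
    def __next__(self):
        if self.succeeded or self.remaining == 0:
            raise StopIteration
        self.remaining -= 1
        return _Attempt(self)
```

Used as `for attempt in auto_retry(3, IOError): with attempt: do_something()`; once the attempts are spent, the final failure propagates to the caller.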


> >>>The closest thing to a generic solution I can come
> >>>up with would be to allow for the labeling of for/while loops, and the
> >>>allowing of "break/continue <label>", which continues to that loop
> >>>(breaking all other loops currently nested within), or breaks that loop
> >>>(as well as all other loops currently nested within).
> 
> Or, we simply have user defined statements which are not themselves loops, and 
> use them to create named blocks:

[snipped code to protect the innocent]

> This has the benefit that an arbitrary block of code can be named, and a named 
> TerminateBlock used to exit it.

Scary.


> > That is a mechanism, but I like it even less than the one I offered. 
> > Every time that one wants ot offer themselves the ability to break out
> > of a different loop (no continue here), one must create another
> > try/except clause, further indenting, and causing nontrivial try/except
> > overhead inside nested loops.
> 
> Ah well, that criticism applies to my suggestion, too. However, I suspect any 
> such implementation is going to need to use exceptions for the guts of the flow 
> control, even if that use isn't visible to the programmer.

Not necessarily.  If I were implementing such a thing, then any time
arbitrary break/continues (to a loop that isn't the deepest) were used
in nested loops, I would increment a counter any time a loop was entered,
and decrement the counter any time a loop was exited.  When performing a
break/continue, I would merely set another variable recording which loop
is the target of the final break/continue; the interpreter could then keep
breaking loops while the desired level and current level differed, and
finally perform the break/continue that was requested.

No exceptions necessary, and the increment/decrement should be cheap
(an increment/decrement of a char, given that Python limits
itself to 20 nested fors, and probably should limit itself to X nested
loops, where X < 256).
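
A user-level sketch of that bookkeeping, with the depth and target as ordinary variables standing in for what the interpreter would track (the `break 0` it simulates is hypothetical syntax, shown only in a comment):

```python
def first_match(rows, needle):
    """Find (i, j) of needle, unwinding two loops without exceptions."""
    break_level = None                     # target level of a pending break
    found = None
    for i, row in enumerate(rows):         # loop level 0
        for j, item in enumerate(row):     # loop level 1
            if item == needle:
                found = (i, j)
                break_level = 0            # hypothetical 'break <level 0>'
                break                      # unwind the level-1 loop
        if break_level is not None:        # a break is still propagating
            break                          # unwind the level-0 loop too
    return found
```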


 - Josiah


From python at rcn.com  Mon May  9 08:34:38 2005
From: python at rcn.com (Raymond Hettinger)
Date: Mon, 9 May 2005 02:34:38 -0400
Subject: [Python-Dev] The decorator module
In-Reply-To: <4edc17eb05050822084bb6a34c@mail.gmail.com>
Message-ID: <000a01c55461$2cfd8420$affecc97@oemcomputer>

[Michele Simionato]
> Is there in the plans any facility to copy functions? Currently I am doing
> 
> def copyfunc(func):
>     "Creates an independent copy of a function."
>     c = func.func_code
>     nc = new.code(c.co_argcount, c.co_nlocals, c.co_stacksize, c.co_flags,
>                   c.co_code, c.co_consts, c.co_names, c.co_varnames,
>                   c.co_filename, c.co_name, c.co_firstlineno,
>                   c.co_lnotab, c.co_freevars, c.co_cellvars)
>     return new.function(nc, func.func_globals, func.func_name,
>                         func.func_defaults, func.func_closure)
> 
> and I *hate* it!

Sounds reasonable.

Choices:
- submit a patch adding a __copy__ method to functions,
- submit a patch for the copy module, or
- submit a feature request, assign to me, and wait.
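
In today's Python the same copy can be sketched with `types.FunctionType` (the successor of the long-gone `new` module), using the modern `__code__`-style attribute names; the code object itself can simply be shared:

```python
import types

def copyfunc(func):
    """Create an independent copy of a function.  The code object is
    shared, which is safe -- closures already share code objects."""
    clone = types.FunctionType(
        func.__code__, func.__globals__, func.__name__,
        func.__defaults__, func.__closure__,
    )
    clone.__kwdefaults__ = func.__kwdefaults__
    clone.__dict__.update(func.__dict__)
    return clone

def original(x, y=2):
    return x + y

duplicate = copyfunc(original)
duplicate.marker = "set on the copy only"
```

Attributes set on the copy afterwards do not leak back to the original.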


Raymond Hettinger

From greg.ewing at canterbury.ac.nz  Mon May  9 08:28:59 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 09 May 2005 18:28:59 +1200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
References: <20050507140547.64F0.JCARLSON@uci.edu>
	<427D446B.3080707@ronadam.com> <20050507210108.64F4.JCARLSON@uci.edu>
	<427DB731.8060604@ronadam.com>
Message-ID: <427F032B.2010402@canterbury.ac.nz>

Ron Adam wrote:
> There seems to be some confusion as to whether or 
> not 'for's will do finalizing.  So I was trying to stress that I think 
> regular 'for' loops should not finalize. They should probably give an 
> error if given an object with a try-finally in it or an __exit__ method. 

But if the for-loop can tell whether the iterator
needs finalizing or not, why not have it finalize
the ones that need it and not finalize the ones
that don't? That would be backwards compatible,
since old for-loops working on old iterators would
work as before.

Greg



From jcarlson at uci.edu  Mon May  9 08:40:46 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Sun, 08 May 2005 23:40:46 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427EC2E4.8020205@ronadam.com>
References: <20050508090344.64FA.JCARLSON@uci.edu>
	<427EC2E4.8020205@ronadam.com>
Message-ID: <20050508224845.6503.JCARLSON@uci.edu>


Ron Adam <rrr at ronadam.com> wrote:
> > The argument over whether blocks should loop, I believe has been had;
> > they should.  The various use cases involve multi-part transactions and
> > such.
> 
> I think so now too.  Earlier this week I thought, as Nick did, that 
> the non-looping version was cleaner, but I changed my mind when I realized 
> that looping blocks could be made to work for those cases in a simple and 
> understandable way.

I wasn't expressing my opinion, I was attempting to express as to where
the discussion went and concluded.  I honestly can't remember having an
opinion on the subject, but I seem to have convinced Nick earlier that
they shouldn't loop, and he (re-)convinced me that indeed, they
shouldn't loop.


> I think maybe another alternative is a break buffer or queue, where you 
> push a 'break' onto the buffer and then execute a 'break' to break the 
> current loop. The 'break' in the buffer then breaks the next loop out as 
> soon as the current loop exits, etc.

[snip]

> It needs some polish I think.  ;-)

Goodness, the horror!  When implementation details start bleeding their
way into actual language constructs (using a continue/break stack in
order to control the flow of nested loops), that's a good clue that an
idea has gone a bit too far.

I would honestly prefer gotos, and I would prefer having no change to
existing syntax to gaining gotos.


It's kind of funny.  Every month I spend in python-dev, I feel less
inclined to want to change the Python language (except for the relative
import I need to finish implementing).  Not because it is a pain in the
tookus (though it is), but because many times it is my immediate sense
of aesthetics that causes me to desire change, and my future of code
maintenance makes me think forward to understanding Python 2.3 in the
context of Python 2.9.


 - Josiah


From greg.ewing at canterbury.ac.nz  Mon May  9 09:13:09 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 09 May 2005 19:13:09 +1200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
References: <427D92A8.7040508@gmail.com>
	<20050508043201.15422.945259482.divmod.quotient.25206@ohm>
	<79990c6b0505080454335d413@mail.gmail.com> <427E0442.4040707@gmail.com>
Message-ID: <427F0D85.60301@canterbury.ac.nz>

Nick Coghlan wrote:

> "Loop on this iterator and finalise when done" would be written:
> 
>    for item in itr:
>        process(item)
>    finally:
>        pass

This is approaching things from the wrong end. The user of
an iterator shouldn't need to know or care whether it
requires finalization -- it should Just Work, whatever
context it is used in.

Greg


From ncoghlan at gmail.com  Mon May  9 14:46:54 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 09 May 2005 22:46:54 +1000
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <427F032B.2010402@canterbury.ac.nz>
References: <20050507140547.64F0.JCARLSON@uci.edu>	<427D446B.3080707@ronadam.com>
	<20050507210108.64F4.JCARLSON@uci.edu>	<427DB731.8060604@ronadam.com>
	<427F032B.2010402@canterbury.ac.nz>
Message-ID: <427F5BBE.3050403@gmail.com>

Greg Ewing wrote:
> Ron Adam wrote:
> 
>>There seems to be some confusion as to whether or 
>>not 'for's will do finalizing.  So I was trying to stress that I think 
>>regular 'for' loops should not finalize. They should probably give an 
>>error if given an object with a try-finally in it or an __exit__ method. 
> 
> 
> But if the for-loop can tell whether the iterator
> needs finalizing or not, why not have it finalize
> the ones that need it and not finalize the ones
> that don't? That would be backwards compatible,
> since old for-loops working on old iterators would
> work as before.

When I first started redrafting the PEP, I had essentially this idea in there - 
look for an __exit__() method, if it's there use the new 'finalising' semantics, 
if it isn't, use the old semantics.

This bloats the generated byte code horribly, though - it is necessary to 
produce two complete copies of the for loop code, since we don't know at compile 
time which version (finalising or non-finalising) will be needed.

It also takes away from the programmer the ability to choose to do partial 
iteration on generators that require finalisation. And it does so in a 
non-obvious way: "it works with this iterator, why doesn't it work with that one?"

Accordingly, I switched to a version which puts control back in the hands of the 
programmer. If you know the iterator doesn't need finalising, or if eventual 
finalisation on garbage collection is sufficient, then you can omit the finally 
clause, and get the optimised form of the for loop. Alternatively, if you want 
prompt finalisation (e.g. with an iterator like all_lines() from the PEP 
redraft), then you can include the finally clause and get the behaviour you want.

Failure to finalise promptly on iterators that need it is still a bug - but 
then, so is failing to close a file handle or database connection.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From jimjjewett at gmail.com  Mon May  9 16:52:54 2005
From: jimjjewett at gmail.com (Jim Jewett)
Date: Mon, 9 May 2005 10:52:54 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
Message-ID: <fb6fbf56050509075253beb195@mail.gmail.com>

Nick Coghlan <ncoghlan at gmail.com> wrote:
> Josiah Carlson wrote:

>> This has the benefit that an arbitrary block of code can be named, 
>> and a named TerminateBlock used to exit it.

>> ... I suspect any such implementation is going to need to use
>> exceptions for the guts of the flow control, even if that use isn't
>> visible to the programmer.

> Not necessarily.  If I were implementing such a thing; any time
> arbitrary break/continues (to a loop that isn't the deepest) were used
> in nested loops, I would increment a counter any time a loop was entered,
> and decrement the counter any time a loop was exited.  ...

When named blocks are used in Lisp, they often cross function
boundaries.  Given that, the number of intervening loops could
change depending on external variables.  Since you would have
to pop frames anyhow, Exceptions are the right way to do it.

If you limited the named-block gotos to within a single function/method,
then the loop counter would work (and you could limit obfuscation).
Unfortunately, you would lose most of the power of named blocks, 
while still paying the full ugliness price.  You would also encourage 
people to inline things that ought to be separate functions.

In case it isn't clear, I think named loops would be a mistake.  I
wanted them when I first started, but ... at the moment, I can't
think of any usage that wasn't an ugly speed hack, which is at
least more explicit with the "raise Found" idiom.
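
For anyone unfamiliar with it, the "raise Found" idiom mentioned above looks like this (the names are conventional rather than standard):

```python
class Found(Exception):
    """Private control-flow exception: carries a result out of
    arbitrarily deep loop nesting in a single jump."""

def first_index_pair(matrix, target):
    """Return the (row, column) of the first occurrence of target."""
    try:
        for i, row in enumerate(matrix):
            for j, value in enumerate(row):
                if value == target:
                    raise Found((i, j))
    except Found as e:
        return e.args[0]
    return None
```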

-jJ

From jimjjewett at gmail.com  Mon May  9 17:07:36 2005
From: jimjjewett at gmail.com (Jim Jewett)
Date: Mon, 9 May 2005 11:07:36 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
Message-ID: <fb6fbf5605050908075f8becd8@mail.gmail.com>

Nick Coghlan wrote:
> "Loop on this iterator and finalise when done" would be written:

>    for item in itr:
>        process(item)
>    finally:
>        pass

Greg Ewing wrote:

> This is approaching things from the wrong end. The user of
> an iterator shouldn't need to know or care whether it
> requires finalization -- it should Just Work, whatever
> context it is used in.

If you're saying "lists don't need it, but openfiles do", then I agree;
it shouldn't matter what type of iterator you have.

If you're talking about specific object instances, then the user is the 
only one (outside of the Garbage Collection system) who has a 
chance of knowing whether the rest of the iterator will be needed
later.

When iterating over lines in a file, and breaking out at a sentinel,
the compiler can't know whether you're done, or just leaving the
"real" lines to another piece of code.

Of course, that still raises the "Why are we encouraging bugs?" issue.

If there are no remaining references, then garbage collection is the
answer, and maybe we just need to make it more aggressive.  If
there are remaining references, then maybe the user is wrong about
being done.

-jJ

From mal at egenix.com  Mon May  9 18:19:02 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Mon, 09 May 2005 18:19:02 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427DD377.6040401@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>	<427CC3E3.4090405@v.loewis.de>
	<427D024B.6080207@hathawaymix.org>	<427D0BDB.6050802@egenix.com>
	<427DD377.6040401@v.loewis.de>
Message-ID: <427F8D76.2060204@egenix.com>

Martin v. Löwis wrote:
> M.-A. Lemburg wrote:
> 
>>Unicode has many code points that are meant only for composition
>>and don't have any standalone meaning, e.g. a combining acute
>>accent (U+0301), yet they are perfectly valid code points -
>>regardless of UCS-2 or UCS-4. It is easily possible to break
>>such a combining sequence using slicing, so the most
>>often presented argument for using UCS-4 instead of UCS-2
>>(+ surrogates) is rather weak if seen by daylight.
> 
> 
> I disagree. It is not just about slicing, it is also about
> searching for a character (either through the "in" operator,
> or through regular expressions). If you define an SRE character
> class, such a character class cannot hold a non-BMP character
> in UTF-16 mode, but it can in UCS-4 mode. Consequently,
> implementing XML's lexical classes (such as Name, NCName, etc.)
> is much easier in UCS-4 than it is in UCS-2. In this case,
> combining characters do not matter much, because the XML
> spec is defined in terms of Unicode coded characters, causing
> combining characters to appear as separate entities for lexical
> purposes (unlike half surrogates).

Searching for a character is possible in UCS2 as well -
even for surrogates with "in" now supporting multiple
code point searches:

>>> len(u'\U00010000')
2
>>> u'\U00010000' in u'\U00010001\U00010002\U00010000 and some extra stuff'
True
>>> u'\U00010000' in u'\U00010001\U00010002\U00010003 and some extra stuff'
False

On sre character classes: I don't think that these provide
a good approach to XML lexical classes - custom functions
or methods or maybe even a codec mapping the characters
to their XML lexical class are much more efficient in
practice.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 09 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From pje at telecommunity.com  Mon May  9 18:31:43 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Mon, 09 May 2005 12:31:43 -0400
Subject: [Python-Dev] The decorator module
In-Reply-To: <4edc17eb05050822084bb6a34c@mail.gmail.com>
References: <5.1.1.6.0.20050506123920.020644e0@mail.telecommunity.com>
	<fb6fbf560505060730789906e2@mail.gmail.com>
	<4edc17eb0505060741635ecde8@mail.gmail.com>
	<ca471dc205050607553b4bafed@mail.gmail.com>
	<5.1.1.6.0.20050506123920.020644e0@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050509123057.021a3650@mail.telecommunity.com>

At 01:08 AM 5/9/2005 -0400, Michele Simionato wrote:
>On 5/6/05, Phillip J. Eby <pje at telecommunity.com> wrote:
> > In this case, the informally-discussed proposal is to add a mutable
> > __signature__ to functions, and have it be used by inspect.getargspec(), so
> > that decorators can copy __signature__ from the decoratee to the decorated
> > function.
>
>Is there in the plans any facility to copy functions? Currently I am doing
>
>def copyfunc(func):
>     "Creates an independent copy of a function."
>     c = func.func_code
>     nc = new.code(c.co_argcount, c.co_nlocals, c.co_stacksize, c.co_flags,
>                   c.co_code, c.co_consts, c.co_names, c.co_varnames,
>                   c.co_filename, c.co_name, c.co_firstlineno,
>                   c.co_lnotab, c.co_freevars, c.co_cellvars)
>     return new.function(nc, func.func_globals, func.func_name,
>                         func.func_defaults, func.func_closure)
>
>and I *hate* it!

You don't need to copy the code object; functions can share the same code 
object just fine, and in fact for closures they do all the time.
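
Phillip's point is easy to demonstrate: every closure produced by the same factory already shares one code object, so two functions pointing at the same code is perfectly normal (shown here with the modern `__code__` spelling; in 2.x it was `func_code`):

```python
def make_adder(n):
    def add(x):
        return x + n        # n lives in the closure, not the code object
    return add

add1 = make_adder(1)
add2 = make_adder(2)
# One code object, two distinct functions with distinct behaviour.
shared = add1.__code__ is add2.__code__
```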


From mal at egenix.com  Mon May  9 18:29:59 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Mon, 09 May 2005 18:29:59 +0200
Subject: [Python-Dev] Python's Unicode width default (New Py_UNICODE doc)
In-Reply-To: <427DD4EF.4030109@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>	<427C029D.3090907@v.loewis.de>
	<427D0E80.4080502@egenix.com> <427DD4EF.4030109@v.loewis.de>
Message-ID: <427F9007.3070603@egenix.com>

[Python used to always default to UCS2-Unicode builds;
 this was changed to default to whatever a possibly installed
 TCL system is using - hiding the choice from the user
 and in effect removing the notion of having a Python
 Unicode default configuration]

Martin v. Löwis wrote:
> M.-A. Lemburg wrote:
> 
>>I believe that it would be more appropriate to adjust the _tkinter
>>module to adapt to the TCL Unicode size rather than
>>forcing the complete Python system to adapt to TCL - I don't
>>really see the point in an optional extension module
>>defining the default for the interpreter core.
> 
> 
> _tkinter currently supports, for a UCS-2 Tcl, both UCS-2 and UCS-4
> Python. For an UCS-4 Tcl, it requires Python also to be UCS-4.
> Contributions to support the missing case are welcome.

I'm no expert on _tkinter and don't use it, so I'm
the wrong one to ask :-)

However, making Python's own default depend on some
3rd party software on the machines is bad design.

>>At the very least, this should be a user controlled option.
> 
> 
> It is: by passing --enable-unicode=ucs2, you can force Python
> to use UCS-2 even if Tcl is UCS-4, with the result that
> _tkinter cannot be built anymore (and compilation fails
> with an #error).

I think we should remove the defaulting to whatever
TCL uses and instead warn the user about a possible
problem in case TCL is found and uses a Unicode
width which is incompatible with Python's choice.

The user can then decide whether she finds
_tkinter important enough to turn away from the standard
Python default Unicode width or not (with all the
consequences that go with it, e.g. memory bloat,
problems installing binaries precompiled for standard
Python builds, etc.). This should definitely *not* be
done automatically.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 09 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From rrr at ronadam.com  Mon May  9 20:11:44 2005
From: rrr at ronadam.com (Ron Adam)
Date: Mon, 09 May 2005 14:11:44 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <20050508225019.6506.JCARLSON@uci.edu>
References: <427DF60C.9020402@gmail.com> <427E6746.6070707@ronadam.com>
	<20050508225019.6506.JCARLSON@uci.edu>
Message-ID: <427FA7E0.9020001@ronadam.com>

Josiah Carlson wrote:

 > Ron Adam <rrr at ronadam.com> wrote:
 >
 >> There's also the possibility to use conditional looping based on 
 >> the value returned from the generator.
 >>
 >> do VAR from EXPR if VAR==CONST:
 >>    BLOCK
 >>
 >> This is a bit verbose, but it reads well. :-)
 >
 >
 >
 > Reading well or not, this is not really an option for the same reasons
 > why...
 >
 >   for VAR in EXPR1 if EXPR2:
 > or
 >   for VAR in EXPR1 while EXPR2:
 >
 > are not options.  Keep it simple.
 >

Yes, so just "do [VAR from] EXPR1:"

 >>    3.  Do-loops: A generator-based loop with finalization:  This 
 >> could be both single and multiple pass.  The difference is determined 
 >> by whether or not the generator loops around the yield statement.
 >
 >
 > Offering only generator-based finalization loops is, as I understand it,
 > not an option.


It could also include classes with __exit__ methods, which are really 
just more complex generators when used this way.

But isn't this what PEP 340 *already* proposes?  Or am I missing a subtle 
distinction here?

-Ron



From eric.nieuwland at xs4all.nl  Mon May  9 20:21:15 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Mon, 9 May 2005 20:21:15 +0200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427F032B.2010402@canterbury.ac.nz>
References: <20050507140547.64F0.JCARLSON@uci.edu>
	<427D446B.3080707@ronadam.com>
	<20050507210108.64F4.JCARLSON@uci.edu>
	<427DB731.8060604@ronadam.com> <427F032B.2010402@canterbury.ac.nz>
Message-ID: <48bb6ec262246bddbb0da57da3621c23@xs4all.nl>

Greg Ewing wrote:
> Ron Adam wrote:
>> There seems to be some confusion as to whether or
>> not 'for's will do finalizing.  So I was trying to stress that I think
>> regular 'for' loops should not finalize. They should probably give an
>> error if given an object with a try-finally in it or an __exit__ method.
>
> But if the for-loop can tell whether the iterator
> needs finalizing or not, why not have it finalize
> the ones that need it and not finalize the ones
> that don't? That would be backwards compatible,
> since old for-loops working on old iterators would
> work as before.

That's why I suggested having the behaviour depend on what is passed 
in as EXPR.

for VAR in EXPR:
	BLOCK

could be translated to:

__cleanup = False
__itr = EXPR
if not isinstance(__itr,iterator):
	__itr = iter(__itr)
	__cleanup = True
while True:
	try:
		VAR = __itr.next()
	except StopIteration:
		break
	BLOCK
if __cleanup:
	__itr.__exit__()

Which would require isinstance(__itr,iterator) or equivalent to act as 
a robust test on iterators.
I'll leave 'for' with an 'else' clause as an exercise to the reader.
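
Eric's expansion can be sketched as a runnable helper, with `collections.abc.Iterator` standing in for his hypothetical `iterator` test, and a `hasattr` guard added since ordinary iterators have no `__exit__`:

```python
from collections.abc import Iterator

def run_for(expr, block):
    """Run 'for var in expr: block(var)' with Eric's cleanup rule:
    finalise only iterators the loop created itself via iter()."""
    cleanup = False
    itr = expr
    if not isinstance(itr, Iterator):
        itr = iter(itr)
        cleanup = True
    while True:
        try:
            var = next(itr)
        except StopIteration:
            break
        block(var)
    if cleanup and hasattr(itr, "__exit__"):
        itr.__exit__(None, None, None)
```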

--eric


From rrr at ronadam.com  Mon May  9 20:27:01 2005
From: rrr at ronadam.com (Ron Adam)
Date: Mon, 09 May 2005 14:27:01 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <427F032B.2010402@canterbury.ac.nz>
References: <20050507140547.64F0.JCARLSON@uci.edu>	<427D446B.3080707@ronadam.com>
	<20050507210108.64F4.JCARLSON@uci.edu>	<427DB731.8060604@ronadam.com>
	<427F032B.2010402@canterbury.ac.nz>
Message-ID: <427FAB75.1030209@ronadam.com>

Greg Ewing wrote:

> Ron Adam wrote:
> 
>>There seems to be some confusion as to whether or 
>>not 'for's will do finalizing.  So I was trying to stress that I think 
>>regular 'for' loops should not finalize. They should probably give an 
>>error if given an object with a try-finally in it or an __exit__ method. 
> 
> 
> But if the for-loop can tell whether the iterator
> needs finalizing or not, why not have it finalize
> the ones that need it and not finalize the ones
> that don't? That would be backwards compatible,
> since old for-loops working on old iterators would
> work as before.
> 
> Greg

Ok, so if they check for it, they might as well handle it.  Or maybe 
they shouldn't even check, for performance reasons?  Then there's no error, 
and it's up to the programmer to decide which looping construct to use.

_Ron











From rrr at ronadam.com  Mon May  9 21:25:28 2005
From: rrr at ronadam.com (Ron Adam)
Date: Mon, 09 May 2005 15:25:28 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <20050508224845.6503.JCARLSON@uci.edu>
References: <20050508090344.64FA.JCARLSON@uci.edu>
	<427EC2E4.8020205@ronadam.com>
	<20050508224845.6503.JCARLSON@uci.edu>
Message-ID: <427FB928.5020809@ronadam.com>

Josiah Carlson wrote:

> I wasn't expressing my opinion, I was attempting to express as to where
> the discussion went and concluded.  I honestly can't remember having an
> opinion on the subject, but I seem to have convinced Nick earlier that
> they shouldn't loop, and he (re-)convinced me that indeed, they
> shouldn't loop.

So you can be re-re-convinced if more use cases are found? ;-)


>>It needs some polish I think.  ;-)
> 
> 
> Goodness, the horror!  When implementation details start bleeding their
> way into actual language constructs (using a continue/break stack in
> order to control the flow of nested loops), that's a good clue that an
> idea has gone a bit too far.
> 
> I would honestly prefer gotos, and I would prefer having no change to
> existing syntax to gaining gotos.

Well I happen to like stacks for some things. They can be very useful and 
efficient.  But I agree that they have been horribly abused in all sorts 
of ways.  That doesn't make them bad, though.  In this case I think they 
work well, but the implementation could be improved.

I see a correlation between breaks, continues, and exceptions.  So it 
makes sense to me that they could use similar mechanisms to catch and 
reraise them as needed.

But this seems awkward and it's even *uglier*!

     try:
        # some code in a loop.
     except BreakException, Breakme:
        pass

then later....

     raise Breakme    # breaks a loop now instead of then.


But these examples sort of point out an underlying concept in these 
discussions: that it is useful to be able to postpone or delay code to 
be executed at a later time.  Thus the idioms before-after, enter-exit, 
and init-finalize.  And here again, with try-finally and breaks/continues.

_Ron


> It's kind of funny.  Every month I spend in python-dev, I feel less
> inclined to want to change the Python language (except for the relative
> import I need to finish implementing).  Not because it is a pain in the
> tookus (though it is), but because many times it is my immediate sense
> of aesthetics that causes me to desire change, and my future of code
> maintenance makes me think forward to understanding Python 2.3 in the
> context of Python 2.9 .

Changing it just for the sake of newness or "it's cool" or whatever 
isn't good, of course.  But extending a language, as opposed to changing 
the way it works, is worthwhile as long as it's done in a careful and 
consistent manner.  That's sort of why I'm against changing 'for', and 
for adding the new loop/block.  I see it as a loop with a higher level of 
abstraction: a new tool to be used in new ways, but I want to keep my 
old dependable tools too.


_Ron

The simplest computer language in the world is called MAIN, has only one 
function called main(), which evaluates any arbitrary set of arguments 
you give it.  Programming is now an obsolete profession!

However we are now in desperate need of experts who can understand 
exceedingly long arbitrary sets of arguments and what they will do when 
they are evaluated by MAIN's main() function.

;-)



From jcarlson at uci.edu  Mon May  9 21:24:06 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Mon, 09 May 2005 12:24:06 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <fb6fbf56050509075253beb195@mail.gmail.com>
References: <fb6fbf56050509075253beb195@mail.gmail.com>
Message-ID: <20050509094657.650C.JCARLSON@uci.edu>


Jim Jewett <jimjjewett at gmail.com> wrote:
> 
> Nick Coghlan <ncoghlan at gmail.com> wrote:
> > Josiah Carlson wrote:
> 
> >> This has the benefit that an arbitrary block of code can be named, 
> >> and a named TerminateBlock used to exit it.
> 
> >> ... I suspect any such implementation is going to need to use
> >> exceptions for the guts of the flow control, even if that use isn't
> >> visible to the programmer.
> 
> > Not necessarily.  If I were implementing such a thing; any time
> > arbitrary break/continues (to a loop that isn't the deepest) were used
> > in nested loops, I would increment a counter any time a loop was entered,
> > and decrement the counter any time a loop was exited.  ...
> 
> When named blocks are used in Lisp, they often cross function
> boundaries.  Given that, the number of intervening loops could
> change depending on external variables.  Since you would have
> to pop frames anyhow, Exceptions are the right way to do it.

I wasn't talking about cross-function blocks/named blocks.  I was
strictly talking about nested loops as they currently exist in Python.


> If you limited the named-block gotos to within a single function/method,
> then the loop counter would work (and you could limit obfuscation).
> Unfortunately, you would lose most of the power of named blocks, 
> while still paying the full ugliness price.

That's fine, I don't want named loops or blocks anyhow.  I was merely
offering an implementation that did not require exceptions, and was
necessarily fast (proving both that it could be fast and not require
exceptions).


> You would also encourage 
> people to inline things that ought to be separate functions.

I wouldn't go that far.  If one were to introduce such functionality, it
would be to ease control flow within nested for/while/blocks.  Whether
or not that leads to people inlining code, who are we to say?  It would,
however, complicate the 'inline function' decorator that I seem to have
lost my link to.


> In case it isn't clear, I think named loops would be a mistake.  I
> wanted them when I first started, but ... at the moment, I can't
> think of any usage that wasn't an ugly speed hack, which is at
> least more explicit with the "raise Found" idiom.

Don't get me wrong, I think they would be a mistake as well, but they
would solve the 'does a break statement in a block break its enclosing
loop' question, as well as general nested loop flow control issues.  Now
that we both agree that they shouldn't be done, maybe one of us should
write a PEP for Guido to rule on so that we never have to hear about
loop naming (heh).


 - Josiah


From jcarlson at uci.edu  Mon May  9 21:27:09 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Mon, 09 May 2005 12:27:09 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427FA7E0.9020001@ronadam.com>
References: <20050508225019.6506.JCARLSON@uci.edu>
	<427FA7E0.9020001@ronadam.com>
Message-ID: <20050509120844.650F.JCARLSON@uci.edu>


Ron Adam <rrr at ronadam.com> wrote:
> 
> Josiah Carlson wrote:
> 
>  > Ron Adam <rrr at ronadam.com> wrote:
>  >
>  >> There's also the possibility to use conditional looping based on the 
> value returned from the generator.
>  >>
>  >> do VAR from EXPR if VAR==CONST:
>  >>    BLOCK
>  >>
>  >> This is a bit verbose, but it reads well. :-)
>  >
>  >
>  >
>  > Reading well or not, this is not really an option for the same reasons
>  > why...
>  >
>  >   for VAR in EXPR1 if EXPR2:
>  > or
>  >   for VAR in EXPR1 while EXPR2:
>  >
>  > are not options.  Keep it simple.
>  >
> 
> Yes, so just "do [VAR from] EXPR1:"

Regardless of the 'finalization' syntax, I'm talking about the fact that
including extra 'if EXPR' or 'while EXPR' is not going to be an option.

>  >>    3.  Do-loops: A generator-based loop with finalization:  This 
> could be both single and multiple pass.  The difference is determined by 
> whether or not the generator used loops around the yield statement.
>  >
>  >
>  > Offering only generator-based finalization loops is, as I understand it,
>  > not an option.
> 
> It could also include classes with __exit__ methods which are really 
> just more complex generators when used this way.
> 
> But isn't this what PEP340 *already* proposes?  Or am I missing a subtle 
> distinction here.

It is, in fact, what PEP 340 already proposes.  Let us take a step back
for a moment and realize that this entire discussion is going around in
circles.

From what I understand, we all agree:
1. There should be some mechanism X which signals that an indented suite
is a 'block statement'.  Such blocks are described and finalized as per
PEP 340 (or whatever derivative gets accepted).

2. Standard for/while loops should not be finalized in a timely fashion,
because testing for the proper methods would necessarily slow down large
amounts of current Python, so should be left to the garbage collector.

3. A note as to the additional overhead of finalized blocks should be
mentioned in the source code of the finalization implementation, and a
note on their performance characteristics may or may not need to be
listed in the language reference.

What there is still discussion over:
4. What the syntax should be.
5. Whether or not it loops.
6. Method names.


I find the answers to 4,5,6 to be a matter of opinion, and like many, I
have my own.  However, I do not feel strongly enough about 4,5,6 to
argue about my opinion (I've been attempting to re-state where the
conversation went for the last 2 weeks, as I have managed to read each
and every email about the subject at hand *ick*).

 - Josiah


From rrr at ronadam.com  Mon May  9 22:21:24 2005
From: rrr at ronadam.com (Ron Adam)
Date: Mon, 09 May 2005 16:21:24 -0400
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <20050509120844.650F.JCARLSON@uci.edu>
References: <20050508225019.6506.JCARLSON@uci.edu>
	<427FA7E0.9020001@ronadam.com>
	<20050509120844.650F.JCARLSON@uci.edu>
Message-ID: <427FC644.4080207@ronadam.com>

Josiah Carlson wrote:

>> Ron Adam <rrr at ronadam.com> wrote:

>>Yes, so just "do [VAR from] EXPR1:"
> 
> Regardless of the 'finalization' syntax, I'm talking about the fact that
> including extra 'if EXPR' or 'while EXPR' is not going to be an option. 

Yes, I meant for the syntax to be the shorter form, not for the 
programmer to just leave off the end.

>>But isn't this what PEP340 *already* proposes?  Or am I missing a subtle 
>>distinction here.
> 
> It is, in fact, what PEP 340 already proposes.  Let us take a step back
> for a moment and realize that this entire discussion is going around in
> circles.

I think so too.

> From what I understand, we all agree:
> 1. There should be some mechanism X which signals that an indented suite
> is a 'block statement'.  Such blocks are described and finalized as per
> PEP 340 (or whatever derivative gets accepted).

+1

> 2. Standard for/while loops should not be finalized in a timely fashion,
> because testing for the proper methods would necessarily slow down large
> amounts of current Python, so should be left to the garbage collector.

+1

Also add to this, it is not always desirable to finalize an object after 
use in a for loop.

> 3. A note as to the additional overhead of finalized blocks should be
> mentioned in the source code of the finalization implementation, and a
> note on their performance characteristics may or may not need to be
> listed in the language reference.

+1

> What there is still discussion over:
> 4. What the syntax should be.
> 5. Whether or not it loops.
> 6. Method names.
> 
> I find the answers to 4,5,6 to be a matter of opinion, and like many, I
> have my own.  However, I do not feel strongly enough about 4,5,6 to
> argue about my opinion (I've been attempting to re-state where the
> conversation went for the last 2 weeks, as I have managed to read each
> and every email about the subject at hand *ick*).
> 
>  - Josiah

I think you clarified this well.

Item 4: A list of possible syntaxes with a vote at some point should do.

Item 5:

    (A.) More use cases for looping blocks need to be found.  I think 
there may be some or many that are not obvious at the moment.

    (B.) It may not cost much in performance to include the looping 
behavior.  Maybe this should be put off till there is a working version 
of each, then comparisons of performance can be made in different 
situations?

_Ron



From martin at v.loewis.de  Mon May  9 23:30:54 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Mon, 09 May 2005 23:30:54 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427F8D76.2060204@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>	<427CC3E3.4090405@v.loewis.de>
	<427D024B.6080207@hathawaymix.org>	<427D0BDB.6050802@egenix.com>
	<427DD377.6040401@v.loewis.de> <427F8D76.2060204@egenix.com>
Message-ID: <427FD68E.6020400@v.loewis.de>

M.-A. Lemburg wrote:
> On sre character classes: I don't think that these provide
> a good approach to XML lexical classes - custom functions
> or methods or maybe even a codec mapping the characters
> to their XML lexical class are much more efficient in
> practice.

That isn't my experience: functions that scan XML strings
are much slower than regular expressions. I can't imagine
how a custom codec could work, so I cannot comment on that.
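As a rough illustration of the regex approach (the character class below covers only an ASCII subset of XML's actual NameChar production, purely for the sake of the sketch):

```python
import re

# Simplified ASCII subset of XML's Name/NameChar productions --
# a real classifier would need the full Unicode ranges from the XML spec.
_name = re.compile(r'[A-Za-z_][A-Za-z0-9._-]*')

def scan_names(text):
    """Collect name-like tokens in one pass, letting sre do the scanning."""
    return _name.findall(text)

print(scan_names('<a href="x">b</a>'))  # ['a', 'href', 'x', 'b', 'a']
```

The scanning loop runs inside the regex engine rather than in Python-level function calls, which is where the speed difference Martin describes comes from.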

Regards,
Martin


From martin at v.loewis.de  Mon May  9 23:32:08 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Mon, 09 May 2005 23:32:08 +0200
Subject: [Python-Dev] Python's Unicode width default (New Py_UNICODE doc)
In-Reply-To: <427F9007.3070603@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>	<427C029D.3090907@v.loewis.de>
	<427D0E80.4080502@egenix.com> <427DD4EF.4030109@v.loewis.de>
	<427F9007.3070603@egenix.com>
Message-ID: <427FD6D8.2010003@v.loewis.de>

M.-A. Lemburg wrote:
> I think we should remove the defaulting to whatever
> TCL uses and instead warn the user about a possible
> problem in case TCL is found and uses a Unicode
> width which is incompatible with Python's choice.

-1.

Regards,
Martin

From ncoghlan at gmail.com  Mon May  9 23:51:03 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 10 May 2005 07:51:03 +1000
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <000101c55419$4e26ac00$76b0958d@oemcomputer>
References: <000101c55419$4e26ac00$76b0958d@oemcomputer>
Message-ID: <427FDB47.80608@gmail.com>

Raymond Hettinger wrote:
> [Nick Coghlan]
> 
>>The number of good use cases for a looping block statement currently
>>stands at
>>exactly 1 (auto_retry). Every other use case suggested (locking,
> 
> opening,
> 
>>suppressing, etc) involves factoring out try statement boiler plate
> 
> that
> 
>>is far
>>easier to comprehend with a single pass user defined statement. 
> 
> 
> I would like to offer up one additional use case, eliminating redundant
> code in a do-while construct:
> 
> 
> def do_while(cond_func):
>     yield
>     while cond_func():
>         yield
> 
> block do_while(lambda: a>b):
>     <sequence of actions affecting a and b>

Nice example, but it doesn't need to intercept exceptions the way auto_retry 
does. Accordingly, a 'for' loop which makes [VAR in] optional would do the job 
just fine:

   for do_while(lambda: a>b):
       <do it>

Even today, the following works:

   def do_while(cond_func):
       yield None
       while cond_func():
           yield None

   for _ in do_while(lambda: a>b):
       <do it>
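For what it's worth, that version runs as-is today; a quick self-contained check (the variable names are invented):

```python
def do_while(cond_func):
    # Always yield once, then keep yielding while the condition holds,
    # giving do-while semantics to an ordinary for loop.
    yield None
    while cond_func():
        yield None

# The body runs once per yield: here for a = 3, 2, 1.
a, b = 3, 0
passes = 0
for _ in do_while(lambda: a > b):
    a -= 1
    passes += 1
print(passes)  # 3
```

Because the first yield is unconditional, the body is guaranteed to execute at least once even when the condition is initially false, which is exactly the do-while property.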

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From jcarlson at uci.edu  Tue May 10 00:25:00 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Mon, 09 May 2005 15:25:00 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <427FB928.5020809@ronadam.com>
References: <20050508224845.6503.JCARLSON@uci.edu>
	<427FB928.5020809@ronadam.com>
Message-ID: <20050509135140.6515.JCARLSON@uci.edu>


Ron Adam <rrr at ronadam.com> wrote:
> Josiah Carlson wrote:
> 
> > I wasn't expressing my opinion, I was attempting to express as to where
> > the discussion went and concluded.  I honestly can't remember having an
> > opinion on the subject, but I seem to have convinced Nick earlier that
> > they shouldn't loop, and he (re-)convinced me that indeed, they
> > shouldn't loop.
> 
> So you can be re-re-convinced if more use cases are found? ;-)

Historically, my opinion has meant close to 0 in regards to language
features, so even if I were to change my mind again, I doubt it would
have any effect on the outcome of this feature.

> >>It needs some polish I think.  ;-)
> > 
> > 
> > Goodness, the horror!  When implementation details start bleeding their
> > way into actual language constructs (using a continue/break stack in
> > order to control the flow of nested loops), that's a good clue that an
> > idea has gone a bit too far.
> > 
> > I would honestly prefer gotos, and I would prefer having no change to
> > existing syntax to gaining gotos.
> 
> Well, I happen to like stacks for some things. They can be very useful and 
> efficient.  But I agree that they have been horribly abused in all sorts 
> of ways.  That doesn't make them bad, though.  In this case I think they 
> work well, but the implementation could be improved.

You missed my point.  As an implementation detail, stacks are fine.  As
a way of describing how one uses the feature, though, talk of stacks is
wholly out of the question.

Think of it in terms of function calls.  Do we talk about 'call stacks'
when we make function calls?  Of course not, and call stacks are a major
part about how Python is implemented.  This tells us that we certainly
shouldn't talk about 'loop stacks' when we are talking about nested
loops.


> But this seems awkward and it's even *uglier*!
> 
>      try:
>         # some code in a loop.
>      except BreakException, Breakme:
>         pass
> 
> then later....
> 
>      raise Breakme    # breaks a loop now instead of then.

That's why we don't see many (if any) uses right now, even if it does
solve the nested loop control flow 'problem'.


> But these examples sort of point out an underlying concept in these 
> discussions: that it is useful to be able to postpone or delay code to 
> be executed at a later time.  Thus the idioms before-after, enter-exit, 
> and init-finalize.  And here again, with try-finally and breaks/continues.

Apple, Apple, Apple, Apple, hotdog.  One of those doesn't quite fit, and
that's break/continue.  Break/continue statements are about control flow,
and while the other mechanisms can be used for control flow, that is not
their primary purpose.  The first three are about resource
acquisition/release and try/finally is about making sure that a block of
code is executed "no matter what" (the first three using try/finally as
a mechanism to guarantee resource release, if acquisition has taken
place).
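The acquisition/release pattern being described is today's try/finally boilerplate; a minimal sketch of the idiom a block statement would factor out (the protected operation here is a stand-in):

```python
import threading

lock = threading.Lock()

# Acquire the resource, then guarantee its release "no matter what"
# happens in the body -- the boilerplate a block statement would absorb.
lock.acquire()
try:
    result = 6 * 7  # stand-in for the protected operation
finally:
    lock.release()

print(result)          # 42
print(lock.locked())   # False -- released on the normal path too
```

The finally clause runs whether the body completes, raises, or is left by break/return, which is why the first three idioms above all reduce to it.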

Let us look at the flow of the conversation...
"There should be a resource acquisition/release statement with a code
suite attached."
... led to ...
"The behavior of a break within such a code suite is ambiguous."
... led to ...
"Flow within nested loops themselves can be tricky and/or ambiguous."

It's not that there is an underlying concept; it's just that the third
bit of the conversation hadn't been brought up, and 'solving' that bit
would resolve the previous 'break within code block' ambiguity.
It's a sledgehammer solution to a jeweler's hammer problem.


> > It's kind of funny.  Every month I spend in python-dev, I feel less
> > inclined to want to change the Python language (except for the relative
> > import I need to finish implementing).  Not because it is a pain in the
> > tookus (though it is), but because many times it is my immediate sense
> > of aesthetics that causes me to desire change, and my future of code
> > maintenance makes me think forward to understanding Python 2.3 in the
> > context of Python 2.9 .
> 
> Changing it just for the sake of newness or "it's cool" or whatever 
> isn't good of course.

In general, I was thinking "that would make my life as a programmer
working with Python easier", but within a few days I realize that
Python's power is in its simplicity.  With every new language
ability/feature, learning and using the language (necessarily) becomes
more difficult.


> But extending a language, as opposed to changing the way it 
> works, is worthwhile as long as it's done in a careful and consistent 
> manner.  That's sort of why I'm against changing 'for', and for adding 
> the new loop/block.  I see it as a loop with a higher level of 
> abstraction. A new tool to be used in new ways, but I want to keep my 
> old dependable tools too.

I'm not sure that the discussion has been leading toward a change that
is 'careful and consistent'; it has been more a matter of people offering
their pet syntaxes (I've been having déjà vu of the decorator discussion
of last spring/summer).  Of course, the only person you need to convince
is Guido, and he's already all but signed off on the entirety of PEP 340;
we seem to be down to syntax musings right now.


 - Josiah


From greg.ewing at canterbury.ac.nz  Tue May 10 04:45:44 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 10 May 2005 14:45:44 +1200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <427FB928.5020809@ronadam.com>
References: <20050508090344.64FA.JCARLSON@uci.edu>
	<427EC2E4.8020205@ronadam.com> <20050508224845.6503.JCARLSON@uci.edu>
	<427FB928.5020809@ronadam.com>
Message-ID: <42802058.2070806@canterbury.ac.nz>

Ron Adam wrote:

> That's sort of why I'm against changing 'for', and for adding 
> the new loop/block.  I see it as a loop with a higher level of 
> abstraction. A new tool to be used in new ways, but I want to keep my 
> old dependable tools too.

But if there's too much overlap in functionality
between the old and the new tool, you're in danger
of losing TOOWTDI.

Random thought for the day:

Programming tools are different from physical tools.
I own quite a few different screwdrivers, several
of which would be more or less equally good for
any particular screw, and this isn't a problem.
But I don't have a big crowd of people looking
over my shoulder while I work, all trying to figure
out why I chose one particular screwdriver over
another, and decide which would be the best
screwdriver to use on their screws.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Tue May 10 04:50:22 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 10 May 2005 14:50:22 +1200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <20050509120844.650F.JCARLSON@uci.edu>
References: <20050508225019.6506.JCARLSON@uci.edu>
	<427FA7E0.9020001@ronadam.com> <20050509120844.650F.JCARLSON@uci.edu>
Message-ID: <4280216E.7010304@canterbury.ac.nz>

Josiah Carlson wrote:

> 2. Standard for/while loops should not be finalized in a timely fashion,
> because testing for the proper methods would necessarily slow down large
> amounts of current Python, so should be left to the garbage collector.

I'm not convinced about that. If implemented properly,
it could be as simple as testing whether a slot of a
type object is populated during processing of the
bytecode which causes exit from the loop.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Tue May 10 05:10:59 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 10 May 2005 15:10:59 +1200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <427F5BBE.3050403@gmail.com>
References: <20050507140547.64F0.JCARLSON@uci.edu>
	<427D446B.3080707@ronadam.com> <20050507210108.64F4.JCARLSON@uci.edu>
	<427DB731.8060604@ronadam.com> <427F032B.2010402@canterbury.ac.nz>
	<427F5BBE.3050403@gmail.com>
Message-ID: <42802643.6060705@canterbury.ac.nz>

Nick Coghlan wrote:

> This bloats the generated byte code horribly, though - it is necessary to 
> produce two complete copies of the for loop code, since we don't know at compile 
> time which version (finalising or non-finalising) will be needed.

Unless I'm seriously mistaken, all the Python-equivalent
loop code that's been presented is only for expositional
purposes -- in real life, the logic would be embedded in
the ceval code that implements the for-loop control
bytecodes, so there would be little or no difference in
the bytecode from what is generated today.

> It also takes away from the programmer the ability to choose to do partial 
> iteration on generators that require finalisation.
> 
> Accordingly, I switched to a version which puts control back in the hands of the 
> programmer.

I still think it puts far too much burden on the user,
though. The vast majority of the time, for-loops are
intended to consume their iterators, and the user
may not even know what flavour of iterator is being
used, much less want to have to think about it. This
means that nearly *every* for-loop would need to have
a finally tacked on the end of it as a matter of
course.

It would be better to do it the other way around, and
have a different form of looping statement for when
you *don't* want finalization. The programmer knows he's
doing a partial iteration when he writes the code,
and is therefore in a position to choose the right
statement.

For backwards compatibility, the existing for-loop
would work for partial iteration of old iterators,
but this usage would be deprecated.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Tue May 10 05:11:01 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 10 May 2005 15:11:01 +1200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <fb6fbf56050509075253beb195@mail.gmail.com>
References: <fb6fbf56050509075253beb195@mail.gmail.com>
Message-ID: <42802645.1000204@canterbury.ac.nz>

Jim Jewett wrote:

> In case it isn't clear, I think named loops would be a mistake.  I
> wanted them when I first started, but ... at the moment, I can't
> think of any usage that wasn't an ugly speed hack, which is at
> least more explicit with the "raise Found" idiom.

I'm inclined to agree. Anything more elaborate than
breaking from a single place in the immediately
enclosing loop tends to be getting into the realm
of spaghetti, in my experience. Giving people named
loops would be tantamount almost to giving them
a goto.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Tue May 10 05:11:04 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 10 May 2005 15:11:04 +1200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <fb6fbf5605050908075f8becd8@mail.gmail.com>
References: <fb6fbf5605050908075f8becd8@mail.gmail.com>
Message-ID: <42802648.1090305@canterbury.ac.nz>

Jim Jewett wrote:

> If you're talking specific object instances, then the user is the 
> only one (outside of the Garbage Collection system) who has a 
> chance of knowing whether the rest of the iterator will be needed
> later.

Indeed. He *does* know whether he will want the iterator
again later. He *doesn't* know whether it will require
finalization when he is eventually done with it, and
failing to finalize it when needed will cause obscure bugs.
Also, not needing it again is the overwhelmingly commoner
use case.

My conclusion is that finalization should be the default,
with a way of explicitly overriding it when necessary.

> If there are no remaining references, then garbage collection is the
> answer, and maybe we just need to make it more aggressive.  If
> there are remaining references, then maybe the user is wrong about
> being done.

Or maybe he's not wrong, and due to the way things are
coded, the reference happens to hang around a little
longer than strictly needed.

If garbage collection were sufficient, we'd be relying
on it to close our files in the first place, and this
whole thread would never have gotten started.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From gvanrossum at gmail.com  Tue May 10 06:58:14 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 9 May 2005 21:58:14 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
Message-ID: <ca471dc2050509215823876c50@mail.gmail.com>

Apologies if this has been discovered and rejected already; I've had
to skip most of the discussions but this thought won't leave my head...

So PEP 310 proposes this:

        with VAR = EXPR:
            BLOCK

translated to

        VAR = EXPR
        if hasattr(VAR, "__enter__"):
            VAR.__enter__()
        try:
            BLOCK
        finally:
            VAR.__exit__()

This equates VAR with the value of EXPR. It has a problem: what if
inside BLOCK an assignment to VAR is made -- does this affect the
finally clause or not? I think that the finally clause should use an
internal variable that isn't affected by assignments to VAR.

But what if we changed the translation slightly so that VAR gets
assigned to value of the __enter__() call:

        abc = EXPR
        VAR = abc.__enter__()      # I don't see why it should be optional
        try:
            BLOCK
        finally:
            abc.__exit__()

Now it would make more sense to change the syntax to

        with EXPR as VAR:
            BLOCK

and we have Phillip Eby's proposal. The advantage of this is that you
can write a relatively straightforward decorator, call it
@with_template, that endows a generator with the __enter__ and
__exit__ methods, so you can write all the examples (except
auto_retry(), which was there mostly to make a point) from PEP 340
like this:

        @with_template
        def opening(filename, mode="r"):
            f = open(filename, mode)
            yield f
            f.close()

and so on. (Note the absence of a try/finally block in the generator
-- the try/finally is guaranteed by the with-statement but not by the
generator framework.)

I used to dislike this, but the opposition and the proliferation of
alternative proposals have made me realize that I'd rather have this
(plus "continue EXPR" which will be moved to a separate PEP once I
have some extra time) than most of the other proposals.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Tue May 10 07:14:24 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 9 May 2005 22:14:24 -0700
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
In-Reply-To: <42802645.1000204@canterbury.ac.nz>
References: <fb6fbf56050509075253beb195@mail.gmail.com>
	<42802645.1000204@canterbury.ac.nz>
Message-ID: <ca471dc2050509221418e07684@mail.gmail.com>

[Jim Jewett]
> > In case it isn't clear, I think named loops would be a mistake.  I
> > wanted them when I first started, but ... at the moment, I can't
> > think of any usage that wasn't an ugly speed hack, which is at
> > least more explicit with the "raise Found" idiom.

[Greg Ewing]
> I'm inclined to agree. Anything more elaborate than
> breaking from a single place in the immediately
> enclosing loop tends to be getting into the realm
> of spaghetti, in my experience. Giving people named
> loops would be tantamount almost to giving them
> a goto.

Yes please. Stop all discussion of breaking out of multiple loops. It
ain't gonna happen before my retirement.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From mwh at python.net  Tue May 10 09:16:25 2005
From: mwh at python.net (Michael Hudson)
Date: Tue, 10 May 2005 08:16:25 +0100
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc2050509215823876c50@mail.gmail.com> (Guido van Rossum's
	message of "Mon, 9 May 2005 21:58:14 -0700")
References: <ca471dc2050509215823876c50@mail.gmail.com>
Message-ID: <2m8y2nv9di.fsf@starship.python.net>

Guido van Rossum <gvanrossum at gmail.com> writes:

> Apologies if this has been discovered and rejected already; I've had
> to skip most of the discussions but this thought won't leave my head...
>
> So PEP 310 proposes this:
>
>         with VAR = EXPR:
>             BLOCK
>
> translated to
>
>         VAR = EXPR
>         if hasattr(VAR, "__enter__"):
>             VAR.__enter__()
>         try:
>             BLOCK
>         finally:
>             VAR.__exit__()
>
> This equates VAR with the value of EXPR. It has a problem: what if
> inside BLOCK an assignment to VAR is made -- does this affect the
> finally clause or not? I think that the finally clause should use an
> internal variable that isn't affected by assignments to VAR.

Uh, if that's not clear from the PEP (and I haven't looked) it's an
oversight.  VAR is optional in PEP 310, after all.

Cheers,
mwh

-- 
  There's an aura of unholy black magic about CLISP.  It works, but
  I have no idea how it does it.  I suspect there's a goat involved
  somewhere.                     -- Johann Hibschman, comp.lang.scheme

From mal at egenix.com  Tue May 10 11:07:01 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Tue, 10 May 2005 11:07:01 +0200
Subject: [Python-Dev] Python's Unicode width default (New Py_UNICODE doc)
In-Reply-To: <427FD6D8.2010003@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>	<427C029D.3090907@v.loewis.de>	<427D0E80.4080502@egenix.com>
	<427DD4EF.4030109@v.loewis.de>	<427F9007.3070603@egenix.com>
	<427FD6D8.2010003@v.loewis.de>
Message-ID: <428079B5.6010602@egenix.com>

Martin v. Löwis wrote:
> M.-A. Lemburg wrote:
> 
>>I think we should remove the defaulting to whatever
>>TCL uses and instead warn the user about a possible
>>problem in case TCL is found and uses a Unicode
>>width which is incompatible with Python's choice.
> 
> -1.

Martin, please reconsider... the choice is between:

a) We have a cross-platform default Unicode width
   setting of UCS2.

b) The default Unicode width is undefined and the only
   thing we can tell the user is:

   Run the configure script and then try the interpreter
   to check whether you've got a UCS2 or UCS4 build.

Option b) is what the current build system implements
and causes problems since the binary interface of the
interpreter changes depending on the width of Py_UNICODE
making UCS2 and UCS4 builds incompatible.

I want to change the --enable-unicode switch back to
always use UCS2 as default and add a new option value
"tcl" which then triggers the behavior you've added to
support _tkinter, i.e.

    --enable-unicode=tcl

bases the decision to use UCS2 or UCS4 on the installed
TCL interpreter (if there is one).

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 10 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From ncoghlan at gmail.com  Tue May 10 11:18:36 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 10 May 2005 19:18:36 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc2050509215823876c50@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
Message-ID: <42807C6C.1070708@gmail.com>

Guido van Rossum wrote:
> Apologies if this has been discovered and rejected already; I've had
> to skip most of the discussions but this thought won't leave my head...
> 
> So PEP 310 proposes this:
> 
>         with VAR = EXPR:
>             BLOCK
> 
> translated to
> 
>         VAR = EXPR
>         if hasattr(VAR, "__enter__"):
>             VAR.__enter__()
>         try:
>             BLOCK
>         finally:
>             VAR.__exit__()
> 
> I used to dislike this, but the opposition and the proliferation of
> alternative proposals have made me realize that I'd rather have this
> (plus "continue EXPR" which will be moved to a separate PEP once I
> have some extra time) than most of the other proposals.

The User Defined Statement section of my PEP redraft suggests something very 
similar to this:
http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

It suggests more complex semantics, so that the statement template has the 
chance to intercept exceptions raised in the block, and can tell the difference 
between normal termination and exiting the block via break, continue or return 
statements. This is needed to support some of the use cases (like the 
transaction() template). All of the PEP 340 examples are written up at the end 
of the PEP redraft, along with some of the motivating cases for a non-looping 
construct.

(Ignore the part in the redraft about for loops for the moment - Greg Ewing has 
convinced me that what I currently have gets the default behaviour backwards. 
And, in relation to that, the next version will require a decorator to enable 
__enter__() and __exit__() methods on a given generator).

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From mal at egenix.com  Tue May 10 11:19:20 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Tue, 10 May 2005 11:19:20 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427FD68E.6020400@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>	<427CC3E3.4090405@v.loewis.de>	<427D024B.6080207@hathawaymix.org>	<427D0BDB.6050802@egenix.com>	<427DD377.6040401@v.loewis.de>
	<427F8D76.2060204@egenix.com> <427FD68E.6020400@v.loewis.de>
Message-ID: <42807C98.3090208@egenix.com>

Martin v. Löwis wrote:
> M.-A. Lemburg wrote:
> 
>>On sre character classes: I don't think that these provide
>>a good approach to XML lexical classes - custom functions
>>or methods or maybe even a codec mapping the characters
>>to their XML lexical class are much more efficient in
>>practice.
> 
> 
> That isn't my experience: functions that scan XML strings
> are much slower than regular expressions.  I can't imagine
> how a custom codec could work, so I cannot comment on that.

If all you're interested in is the lexical class of the code points
in a string, you could use such a codec to map each code point
to a code point representing the lexical class. Then run re
as usual on the mapped Unicode string. Since the indices of
the matches found in the resulting string will be the same as
in the original string, it's easy to extract the corresponding
data from the original string.
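
Roughly, the idea looks like this (using a made-up two-class alphabet
rather than the real XML lexical classes, and an ordinary function in
place of a codec):

```python
import re

# Made-up two-class mapping: 'N' for name-ish characters, 'O' for
# everything else.  The real XML lexical classes are much richer.
def lexical_map(s):
    return ''.join('N' if (c.isalpha() or c == '_') else 'O' for c in s)

text = '<tag attr="value"/>'
mapped = lexical_map(text)

# Match indices in the mapped string are valid in the original string,
# so the corresponding data can be sliced straight out of it.
names = [text[m.start():m.end()] for m in re.finditer('N+', mapped)]
```

The point is that the re machinery only ever sees the class string,
while all the slicing happens on the original data.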

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 10 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From ncoghlan at gmail.com  Tue May 10 11:39:40 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 10 May 2005 19:39:40 +1000
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <42802643.6060705@canterbury.ac.nz>
References: <20050507140547.64F0.JCARLSON@uci.edu>
	<427D446B.3080707@ronadam.com>
	<20050507210108.64F4.JCARLSON@uci.edu>
	<427DB731.8060604@ronadam.com> <427F032B.2010402@canterbury.ac.nz>
	<427F5BBE.3050403@gmail.com> <42802643.6060705@canterbury.ac.nz>
Message-ID: <4280815C.2050007@gmail.com>

Greg Ewing wrote:
> Unless I'm seriously mistaken, all the Python-equivalent
> loop code that's been presented is only for expositional
> purposes -- in real life, the logic would be embedded in
> the ceval code that implements the for-loop control
> bytecodes, so there would be little or no difference in
> the bytecode from what is generated today.

Hmm, that would obviously be more sensible. OK, I'll drop that from my list of 
concerns :)

> It would be better to do it the other way around, and
> have a different form of looping statement for when
> you *don't* want finalization. The programmer knows he's
> doing a partial iteration when he writes the code,
> and is therefore in a position to choose the right
> statement.
> 
> For backwards compatibility, the existing for-loop
> would work for partial iteration of old iterators,
> but this usage would be deprecated.

I actually agree, but the pain comes with generators. Doing it this way means 
that generators can't have an __exit__() method by default - it will need to be 
enabled with a decorator of some description (a future statement won't work, 
since it needs to be selectable on a generator by generator basis). It has to be 
done this way, so that old generators (without the decorator) are not 
inadvertently finalised by unmodified for loops (otherwise old code that isn't 
expecting finalisation could break severely).

Hmm, with that approach, a code inspection tool like pychecker could be used to 
pick up the slack, and flag generators which have a yield inside a try/finally 
or a user defined statement without applying the "needs finalisation" decorator 
(assuming the compiler can't detect this for itself).

OK, given the above, finalising by default definitely seems like the right thing 
to do - there's then the question of how to spell "don't finalise this iterator".

It turns out no keyword is needed for that. Instead, an iterator that iterates 
over a supplied iterable suffices:

   class partial_iter(object):
       def __init__(self, iterable):
           self.itr = iter(iterable)
       def __iter__(self):
           return self
       def next(self):
           return self.itr.next()

Then, partial iteration over something that defines __exit__ is possible via:

   for item in partial_iter(itr):
         break
   print list(itr) # itr not finalised, even if it defines __exit__()

That reads reasonably well, and it should be possible to reduce the overhead on 
for loops to a single check of a method slot in the various opcodes that can 
exit a for loop.

So, this idea (or something like it) will go into my next draft.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From michele.simionato at gmail.com  Tue May 10 14:50:09 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Tue, 10 May 2005 08:50:09 -0400
Subject: [Python-Dev] The decorator module
In-Reply-To: <000a01c55461$2cfd8420$affecc97@oemcomputer>
References: <4edc17eb05050822084bb6a34c@mail.gmail.com>
	<000a01c55461$2cfd8420$affecc97@oemcomputer>
Message-ID: <4edc17eb05051005502e061a04@mail.gmail.com>

On 5/9/05, Raymond Hettinger <python at rcn.com> wrote:
> 
> Choices:
> - submit a patch adding a __copy__ method to functions,
> - submit a patch for the copy module, or
> - submit a feature request, assign to me, and wait.

Well, actually I am even more ambitious than that: not only would I like
to be able to copy functions, but I would also like to be able to subclass
FunctionType with a user-defined __copy__ method.
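
(Copying itself is easy enough to emulate by hand, by the way - something
along these lines, though the double-underscore attribute names are the
modern spellings; 2.4 uses func_code, func_globals and friends:)

```python
import types

def copy_function(f):
    # Rebuild a function object from its parts: code, globals, name,
    # default arguments and closure cells.
    g = types.FunctionType(f.__code__, f.__globals__, f.__name__,
                           f.__defaults__, f.__closure__)
    # Carry over any function attributes as well.
    g.__dict__.update(f.__dict__)
    return g
```

The copy shares the code and closure cells with the original, but has
its own attribute dictionary.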

Don't worry, I will submit the feature request ;)

              Michele Simionato 

P.S. I have added yet another example to the documentation of
the decorator module, now arrived at version 0.3:

http://www.phyast.pitt.edu/~micheles/python/decorator.zip

From michele.simionato at gmail.com  Tue May 10 15:15:17 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Tue, 10 May 2005 09:15:17 -0400
Subject: [Python-Dev] The decorator module
In-Reply-To: <4edc17eb05051005502e061a04@mail.gmail.com>
References: <4edc17eb05050822084bb6a34c@mail.gmail.com>
	<000a01c55461$2cfd8420$affecc97@oemcomputer>
	<4edc17eb05051005502e061a04@mail.gmail.com>
Message-ID: <4edc17eb05051006155dfbd503@mail.gmail.com>

On 5/10/05, Michele Simionato <michele.simionato at gmail.com> wrote:
> 
> Well, actually I am even more ambitious than that: not only I would like
> to be able to copy functions, but I also would like to be able to subclass
> FunctionType with an user-defined __copy__ method.

BTW, it seems possible to copy closures, but how about *pickling* them?
Is that technically feasible with a reasonable effort or is it a mess?

  Michele Simionato

From aleax at aleax.it  Tue May 10 16:57:43 2005
From: aleax at aleax.it (Alex Martelli)
Date: Tue, 10 May 2005 07:57:43 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc2050509215823876c50@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
Message-ID: <77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>


On May 9, 2005, at 21:58, Guido van Rossum wrote:

> Apologies if this has been discovered and rejected already; I've had
> to skip most of the discussions but this thought won't leave my head...

Skimming rather than skipping all of the discussion burned most of my  
py-dev time, and it was just skimming, but I don't remember such  
rejections.

> But what if we changed the translation slightly so that VAR gets
> assigned the value of the __enter__() call:
>
>         abc = EXPR
>         VAR = abc.__enter__()      # I don't see why it should be  
> optional
>         try:
>             BLOCK
>         finally:
>             abc.__exit__()
>
> Now it would make more sense to change the syntax to
>
>         with EXPR as VAR:
>             BLOCK
>
> and we have Phillip Eby's proposal. The advantage of this is that you

I like this.  The only aspect of other proposals that I would sorely  
miss here is the ability for abc.__exit__ to deal with  
exceptions raised in BLOCK (or, even better, a separate specialmethod  
on abc called in lieu of __exit__ upon exceptions).  Or am I missing  
something, and would this give a way within abc.__exit__ to examine  
and possibly ``unraise'' such an exception...?

> can write a relatively straightforward decorator, call it
> @with_template, that endows a generator with the __enter__ and
> __exit__ methods, so you can write all the examples (except
> auto_retry(), which was there mostly to make a point) from PEP 340
> like this:
>
>         @with_template
>         def opening(filename, mode="r"):
>             f = open(filename, mode)
>             yield f
>             f.close()
>
> and so on. (Note the absence of a try/finally block in the generator
> -- the try/finally is guaranteed by the with-statement but not by the
> generator framework.)

I must be thick this morning, because this relatively straightforward  
decorator isn't immediately obvious to me -- care to show me how  
with_template gets coded?


Alex



From gvanrossum at gmail.com  Tue May 10 17:34:51 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 10 May 2005 08:34:51 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <42807C6C.1070708@gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<42807C6C.1070708@gmail.com>
Message-ID: <ca471dc205051008345d195787@mail.gmail.com>

[Nick]
> The User Defined Statement section of my PEP redraft suggests something very
> similar to this:
> http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
> 
> It suggests more complex semantics, so that the statement template has the
> chance to intercept exceptions raised in the block, and can tell the difference
> between normal termination and exiting the block via break, continue or return
> statements. This is needed to support some of the use cases (like the
> transaction() template). All of the PEP 340 examples are written up at the end
> of the PEP redraft, along with some of the motivating cases for a non-looping
> construct.

Is that use case strong enough to require the added complexity? For a
transactional wrapper, I can see that __exit__ needs to know about
exceptions (though I'm unsure how much detail it needs), but what's
the point of being able to tell an exception from a non-local goto
(which break, continue and return really are)? I could see the
following, minimal translation:

oke = False
abc = EXPR
var = abc.__enter__()
try:
    BLOCK
    oke = True
finally:
    abc.__exit__(oke)
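
Under that translation, a transactional template only needs the success
flag - something like this sketch, where 'db' stands in for any object
with begin/commit/rollback methods:

```python
class transaction(object):
    # Illustrative only: assumes 'db' has begin/commit/rollback methods.
    def __init__(self, db):
        self.db = db
    def __enter__(self):
        self.db.begin()
        return self.db
    def __exit__(self, oke):
        # oke is True only if BLOCK ran to completion.
        if oke:
            self.db.commit()
        else:
            self.db.rollback()
```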

What's your use case for giving __enter__ an opportunity to skip the
block altogether?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Tue May 10 17:49:17 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 10 May 2005 11:49:17 -0400
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<ca471dc2050509215823876c50@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>

At 07:57 AM 5/10/2005 -0700, Alex Martelli wrote:
>On May 9, 2005, at 21:58, Guido van Rossum wrote:
> > But what if we changed the translation slightly so that VAR gets
> > assigned the value of the __enter__() call:
> >
> >         abc = EXPR
> >         VAR = abc.__enter__()      # I don't see why it should be
> > optional
> >         try:
> >             BLOCK
> >         finally:
> >             abc.__exit__()
> >
> > Now it would make more sense to change the syntax to
> >
> >         with EXPR as VAR:
> >             BLOCK
> >
> > and we have Phillip Eby's proposal. The advantage of this is that you
>
>I like this.  The only aspect of other proposals that I would sorely
>miss here is the ability for abc.__exit__ to deal with
>exceptions raised in BLOCK (or, even better, a separate specialmethod
>on abc called in lieu of __exit__ upon exceptions).  Or am I missing
>something, and would this give a way within abc.__exit__ to examine
>and possibly ``unraise'' such an exception...?

Yeah, I'd ideally like to see __try__, __except__, __else__, and 
__finally__ methods, matching the respective semantics of those clauses in 
a try/except/finally block.


> > can write a relatively straightforward decorator, call it
> > @with_template, that endows a generator with the __enter__ and
> > __exit__ methods, so you can write all the examples (except
> > auto_retry(), which was there mostly to make a point) from PEP 340
> > like this:
> >
> >         @with_template
> >         def opening(filename, mode="r"):
> >             f = open(filename, mode)
> >             yield f
> >             f.close()
> >
> > and so on. (Note the absence of a try/finally block in the generator
> > -- the try/finally is guaranteed by the with-statement but not by the
> > generator framework.)
>
>I must be thick this morning, because this relatively straightforward
>decorator isn't immediately obvious to me -- care to show me how
>with_template gets coded?

Something like this, I guess:

     def with_template(f):
         class controller(object):
             def __init__(self,*args,**kw):
                 self.iter = f(*args,**kw)

             def __enter__(self):
                 return self.iter.next()

             def __exit__(self):
                 self.iter.next()
         return controller

But I'd rather see it with __try__/__except__ and passing exceptions into 
the generator so that the generator can use try/except/finally blocks to 
act on the control flow.


From gvanrossum at gmail.com  Tue May 10 17:47:06 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 10 May 2005 08:47:06 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
Message-ID: <ca471dc20505100847307a2e3b@mail.gmail.com>

[Alex]
> I like this.  The only aspect of other proposals that I would sorely
> miss here is the ability for abc.__exit__ to deal with
> exceptions raised in BLOCK (or, even better, a separate specialmethod
> on abc called in lieu of __exit__ upon exceptions).  Or am I missing
> something, and would this give a way within abc.__exit__ to examine
> and possibly ``unraise'' such an exception...?

See my followup to Nick. I'm not worried about unraising exceptions.
The only way to mess with the exception from code in a finally-suite
is to raise another exception, and we can't really prevent that.

However (I forgot this in the response to Nick) unless/until we
augment generators in some way the generator can't easily see the
exception flag.

[me]
> > can write a relatively straightforward decorator, call it
> > @with_template, that endows a generator with the __enter__ and
> > __exit__ methods, so you can write all the examples (except
> > auto_retry(), which was there mostly to make a point) from PEP 340
> > like this:
> >
> >         @with_template
> >         def opening(filename, mode="r"):
> >             f = open(filename, mode)
> >             yield f
> >             f.close()
> >
> > and so on. (Note the absence of a try/finally block in the generator
> > -- the try/finally is guaranteed by the with-statement but not by the
> > generator framework.)

[Alex]
> I must be thick this morning, because this relatively straightforward
> decorator isn't immediately obvious to me -- care to show me how
> with_template gets coded?

Here's a sketch:

class Wrapper(object):
    def __init__(self, gen):
        self.gen = gen
        self.state = "initial"
    def __enter__(self):
        assert self.state == "initial"
        self.state = "entered"
        try:
            return self.gen.next()
        except StopIteration:
            self.state = "error"
            raise RuntimeError("template generator didn't yield")
    def __exit__(self):
        assert self.state == "entered"
        self.state = "exited"
        try:
            self.gen.next()
        except StopIteration:
            return
        else:
            self.state = "error"
            raise RuntimeError("template generator didn't stop")

def with_template(func):
    def helper(*args, **kwds):
        return Wrapper(func(*args, **kwds))
    return helper

@with_template
def opening(filename, mode="r"):
    f = open(filename, mode) # Note that IOError here is untouched by Wrapper
    yield f
    f.close() # Ditto for errors here (however unlikely)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From ncoghlan at gmail.com  Tue May 10 17:50:02 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 11 May 2005 01:50:02 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc2050509215823876c50@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
Message-ID: <4280D82A.4000709@gmail.com>

Guido van Rossum wrote:
> I used to dislike this, but the opposition and the proliferation of
> alternative proposals have made me realize that I'd rather have this
> (plus "continue EXPR" which will be moved to a separate PEP once I
> have some extra time) than most of the other proposals.

Draft 1.3 of my PEP 310/PEP 340 merger is now up for public consumption:
http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

This is a major rewrite since version 1.2. Significant changes are:

- reorder and reword things to emphasise the user defined statements, and their 
ability to factor out arbitrary try/except/else/finally boilerplate.
- use 'do' as the keyword instead of 'stmt' (I have to say, I *really* like the 
way 'do' looks and reads in all of the examples)
- clean up the semantics of user defined statements so as to make manual 
statement templates as easy to write as those in PEP 310
- give iterator finalisation its own slot, __finish__() (distinct from the 
__exit__() of statement templates)
- define sensible automatic finalisation semantics for iterative loops
- fill out the Rejected Options section meaningfully, with justifications for 
rejecting certain options
- make the PEP more self-contained, with significantly fewer references to PEP 340.

These changes try to take into account the feedback I got on the previous 
drafts, as well as fixing a few problems I noticed myself.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From gvanrossum at gmail.com  Tue May 10 17:51:17 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 10 May 2005 08:51:17 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
Message-ID: <ca471dc205051008511e9e8d91@mail.gmail.com>

[Phillip J. Eby]
> Yeah, I'd ideally like to see __try__, __except__, __else__, and
> __finally__ methods, matching the respective semantics of those clauses in
> a try/except/finally block.

What's your use case for adding this complexity? I'm going for simple
here unless there's a really strong use case. Anyway, a wrapped
generator can't do anything with all those distinctions unless we
augment the generator somehow ("continue EXPR" would suffice).

(Your decorator is equivalent to mine, but I don't like creating a new
class each time.)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From ncoghlan at gmail.com  Tue May 10 18:17:43 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 11 May 2005 02:17:43 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc205051008345d195787@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	
	<42807C6C.1070708@gmail.com>
	<ca471dc205051008345d195787@mail.gmail.com>
Message-ID: <4280DEA7.6070106@gmail.com>

Guido van Rossum wrote:
> [Nick]
> Is that use case strong enough to require the added complexity? For a
> transactional wrapper, I can see that __exit__ needs to know about
> exceptions (though I'm unsure how much detail it needs), but what's
> the point of being able to tell an exception from a non-local goto
> (which break, continue and return really are)?

The only real reason the statement template can tell the difference is that 
those non-local gotos all result in TerminateBlock being passed in as the 
exception (that's why the __exit__ method can't really tell the difference 
between those statements and the user code raising TerminateBlock, and also why 
TerminateBlock can't be suppressed by the statement template).

As far as use cases go, any case where we want the statement template to be able 
to manipulate the exception handling requires that this information be passed in 
to the __exit__() method. The current examples given in the PEP are 
transaction() and auto_retry(), but others have been suggested over the course 
of the discussion. One suggestion was for automatically logging exceptions, 
which requires access to the full state of the current exception.
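
For instance, an exception-logging template would need something like the
following (a sketch only - the exact signature of __exit__ is precisely
what is being debated here; this version assumes it is handed the
sys.exc_info() triple, or None on normal exit):

```python
import sys

class log_errors(object):
    # Hypothetical template: __exit__ receives the exception state so
    # that it can record it before letting the exception propagate.
    def __init__(self, log):
        self.log = log
    def __enter__(self):
        return self
    def __exit__(self, exc_info):
        if exc_info is not None:
            self.log.append(exc_info[0].__name__)
```

On normal completion __exit__(None) does nothing; an exception raised in
the block gets logged and still propagates, since __exit__ neither
catches nor suppresses it.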

I go into the motivation behind this a bit more in the updated draft I just 
posted (version 1.3). The basic idea is to allow factoring out of arbitrary 
try/except/else/finally code into a statement template, and use a 'do' statement 
to provide the contents of the 'try' clause.

If the exception information isn't passed in, then we can really only factor out 
try/finally statements, which is far less interesting.

> What's your use case for giving __enter__ an opportunity to skip the
> block altogether?

I realised that I don't have one - so the idea doesn't appear in the updated draft.

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Tue May 10 18:30:34 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 11 May 2005 02:30:34 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <4280D82A.4000709@gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<4280D82A.4000709@gmail.com>
Message-ID: <4280E1AA.4060704@gmail.com>

Nick Coghlan wrote:
> Draft 1.3 of my PEP 310/PEP 340 merger is now up for public consumption:
> http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

This draft was meant to drop the idea of __enter__() raising TerminateBlock 
preventing execution of the statement body. I dropped it out of the code 
describing the semantics, but the idea is still mentioned in the text.

I'll probably do another draft to fix that, and various ReST problems tomorrow 
night.

I'll also add in a justification for why I chose the single __exit__ method over 
separate __else__, __except__ and __finally__ methods.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From pje at telecommunity.com  Tue May 10 18:58:36 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 10 May 2005 12:58:36 -0400
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc205051008511e9e8d91@mail.gmail.com>
References: <5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>

At 08:51 AM 5/10/2005 -0700, Guido van Rossum wrote:
>[Phillip J. Eby]
> > Yeah, I'd ideally like to see __try__, __except__, __else__, and
> > __finally__ methods, matching the respective semantics of those clauses in
> > a try/except/finally block.
>
>What's your use case for adding this complexity?

It makes it much easier to mentally translate a given 
try/except/else/finally usage to a "resource" or "block controller" or 
whatever it is we're calling these things now.  You should almost be able 
to just stick 'def __' and '__(self):' and then munge the whole thing into 
a class.

Of course, it's not *really* that simple, because __try__ doesn't exactly 
correspond to 'try:', and it has to return something, but it sure is 
simpler than the mental gymnastics I'd go through to convert 
except/else/finally into "if" statements inside an __exit__.

Granted, if we end up with __enter__ and __exit__, I'll just write a 
resource mixin class whose __exit__ calls a stubbed-out __except__, 
__else__, and __finally__.  Then I won't have to figure out how to write 
__exit__ methods all the time.  Which is great for me, but I was thinking 
that this interface would reduce complexity for people trying to learn how 
to write these things.
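
For what it's worth, that mixin is trivial to write - here's a sketch
(the hook names and the success-flag signature of __exit__ are
assumptions for illustration, not part of any PEP):

```python
import sys

class HandlerMixin(object):
    # __exit__ fans out to stubbed-out hooks so that subclasses only
    # override the clauses they care about.
    def __except__(self, exc_info):
        pass
    def __else__(self):
        pass
    def __finally__(self):
        pass
    def __exit__(self, oke):
        try:
            if oke:
                self.__else__()
            else:
                self.__except__(sys.exc_info())
        finally:
            self.__finally__()
```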

I wasn't advocating this before because PEP 340's use of generators allowed 
you to directly use try/except/else/finally.  But, the new approach seems 
targeted at a wider set of use cases that don't include generators.  IOW, 
it's likely you'll be adding resource-finalization methods to actual 
resource classes, and grafting generators into them to implement 
__enter__/__exit__ seems more complex at first glance than just letting 
people add the methods directly; e.g.:

     def __enter__(self):
         self.handler = self._resource_control()
         return self.handler.__enter__()

     def __exit__(self):
         self.handler.__exit__()

     @with_template
     def _resource_control(self):
         f = self.open("blah")
         try:
             yield f
         finally:
             f.close()

versus this rather more "obvious way" to do it:

     def __try__(self):
         self.f = self.open("blah")
         return self.f

     def __finally__(self):
         self.f.close()

But I suppose you could encapsulate either pattern as a mixin class, so I 
suppose this could be treated as a matter for examples in documentation 
rather than as an implementation aspect.  It's just that if __exit__ has to 
probe exception state or perform other wizardry, it's going to be harder for 
non-wizards to use, and that's what I was reacting to here.   Anyway, I see 
now that documentation and simple mixins could address it, so if you think 
it's best handled that way, so be it.
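
The mixin idea above can be sketched concretely. The `__except__`, `__else__`, and `__finally__` hook names come from the proposal under discussion, but the mixin class itself is purely hypothetical — a sketch of how a two-method `__exit__` protocol could dispatch to the finer-grained hooks:

```python
class FinalizationMixin(object):
    # Hypothetical mixin: routes the two-method protocol's __exit__
    # into the finer-grained hooks discussed above.  Subclasses
    # override only the hooks they need; the defaults do nothing.
    def __except__(self, exc_info):
        pass

    def __else__(self):
        pass

    def __finally__(self):
        pass

    def __exit__(self, *exc_info):
        try:
            # An empty exc_info means the block finished normally.
            if exc_info and exc_info[0] is not None:
                self.__except__(exc_info)
            else:
                self.__else__()
        finally:
            self.__finally__()


class Demo(FinalizationMixin):
    def __init__(self):
        self.calls = []

    def __else__(self):
        self.calls.append('else')

    def __finally__(self):
        self.calls.append('finally')


d = Demo()
d.__exit__()  # normal completion: __else__ then __finally__
```

A subclass writer then only overrides the clause-shaped methods and never thinks about `__exit__` at all, which is the reduced-complexity argument being made here.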


>  I'm going for simple
>here unless there's a really strong use case. Anyway, a wrapped
>generator wrapper can't deal with all those distinctions unless we
>augment the generator somehow ("continue EXPR" would suffice).

You'd use only __try__, __except__, and __else__ to wrap a generator.  For 
some other use cases you'd only use __try__ and __finally__, or __try__ and 
__except__, or __try__ and __else__.  I don't know of any use cases where 
you'd want to use all four simultaneously on the same controller.


>(Your decorator is equivalent to mine, but I don't like creating a new
>class each time.)

Mine was just a sketch to show the idea anyway; I'd be surprised if it 
doesn't have at least one bug.


From martin at v.loewis.de  Tue May 10 20:44:20 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Tue, 10 May 2005 20:44:20 +0200
Subject: [Python-Dev] Python's Unicode width default (New Py_UNICODE doc)
In-Reply-To: <428079B5.6010602@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>	<427C029D.3090907@v.loewis.de>	<427D0E80.4080502@egenix.com>
	<427DD4EF.4030109@v.loewis.de>	<427F9007.3070603@egenix.com>
	<427FD6D8.2010003@v.loewis.de> <428079B5.6010602@egenix.com>
Message-ID: <42810104.3090303@v.loewis.de>

M.-A. Lemburg wrote:
> Martin, please reconsider... the choice is between:

The point is that this all was discussed, and decided the
other way 'round. There is no point in going back and forth
between the two choices:

http://mail.python.org/pipermail/python-dev/2003-June/036461.html

If we remove the code, people will *again* report that
_tkinter stops building on Redhat (see #719880). I
see no value in breaking what works now.

> a) We have a cross-platform default Unicode width
>    setting of UCS2.

It is hardly the cross-platform default anymore. Many
installations on Linux are built as UCS-4 now - no
matter what configure does.

> b) The default Unicode width is undefined and the only
>    thing we can tell the user is:
> 
>    Run the configure script and then try the interpreter
>    to check whether you've got a UCS2 or UCS4 build.

It's not at all undefined. There is a precise, deterministic,
repeatable algorithm that determines the default, and
if people want to know, we can tell them.
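
For what it's worth, a user can also check from inside the interpreter without re-running configure: `sys.maxunicode` reports 0xFFFF on a narrow (UCS-2) build and 0x10FFFF on a wide (UCS-4) build. A minimal sketch:

```python
import sys

# Narrow (UCS-2) builds report 0xFFFF; wide (UCS-4) builds
# report 0x10FFFF.
if sys.maxunicode == 0xFFFF:
    build = 'ucs2'
else:
    build = 'ucs4'
```

This is the "try the interpreter" step from the message above reduced to one attribute lookup.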

> I want to change the --enable-unicode switch back to
> always use UCS2 as default and add a new option value
> "tcl" which then triggers the behavior you've added to
> support _tkinter, ie.
> 
>     --enable-unicode=tcl
> 
> bases the decision to use UCS2 or UCS4 on the installed
> TCL interpreter (if there is one).

Please don't - unless you also go back and re-open the
bug reports, change the documentation, tell the Linux
packagers that settings have changed, and so on.

Why deliberately break what currently works?

Regards,
Martin

From nbastin at opnet.com  Tue May 10 20:48:19 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Tue, 10 May 2005 14:48:19 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <427EEE4F.40405@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427C07C5.7060106@v.loewis.de>
	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>
	<427CC2C2.60000@v.loewis.de>
	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
	<427DD8C1.4060109@v.loewis.de>
	<2bd4e2b88b94f48297ff0dcaef97a7c0@opnet.com>
	<427EEE4F.40405@v.loewis.de>
Message-ID: <793460514a90b965270ec08bc118a3fc@opnet.com>


On May 9, 2005, at 12:59 AM, Martin v. Löwis wrote:

>> Wow, what an inane way of looking at it.  I don't know what world you
>> live in, but in my world, users read the configure options and suppose
>> that they mean something.  In fact, they *have* to go off on their own
>> to assume something, because even the documentation you refer to above
>> doesn't say what happens if they choose UCS-2 or UCS-4.  A logical
>> assumption would be that python would use those CEFs internally, and
>> that would be incorrect.
>
> Certainly. That's why the documentation should be improved. Changing
> the option breaks existing packaging systems, and should not be done
> lightly.

I'm perfectly happy to continue supporting --enable-unicode=ucs2, but 
not displaying it as an option.  Is that acceptable to you?

--
Nick


From martin at v.loewis.de  Tue May 10 20:51:16 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Tue, 10 May 2005 20:51:16 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <42807C98.3090208@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>	<427CC3E3.4090405@v.loewis.de>	<427D024B.6080207@hathawaymix.org>	<427D0BDB.6050802@egenix.com>	<427DD377.6040401@v.loewis.de>
	<427F8D76.2060204@egenix.com> <427FD68E.6020400@v.loewis.de>
	<42807C98.3090208@egenix.com>
Message-ID: <428102A4.5070308@v.loewis.de>

M.-A. Lemburg wrote:
> If all you're interested in is the lexical class of the code points
> in a string, you could use such a codec to map each code point
> to a code point representing the lexical class.

How can I efficiently implement such a codec? The whole point is doing
that in pure Python (because if I had to write an extension module,
I could just as well do the entire lexical analysis in C, without
any regular expressions).

Any kind of associative/indexed table for this task consumes a lot
of memory, and takes quite some time to initialize.

Regards,
Martin

From martin at v.loewis.de  Tue May 10 20:58:21 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Tue, 10 May 2005 20:58:21 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <793460514a90b965270ec08bc118a3fc@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427C07C5.7060106@v.loewis.de>
	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>
	<427CC2C2.60000@v.loewis.de>
	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
	<427DD8C1.4060109@v.loewis.de>
	<2bd4e2b88b94f48297ff0dcaef97a7c0@opnet.com>
	<427EEE4F.40405@v.loewis.de>
	<793460514a90b965270ec08bc118a3fc@opnet.com>
Message-ID: <4281044D.8040203@v.loewis.de>

Nicholas Bastin wrote:
> I'm perfectly happy to continue supporting --enable-unicode=ucs2, but
> not displaying it as an option.  Is that acceptable to you?

It is. Somewhere, the code should say that this is for backwards
compatibility, of course (so people won't remove it too easily;
if there is a plan for obsoleting this setting, it should be
done in a phased manner).

Regards,
Martin

From firemoth at gmail.com  Wed May 11 01:28:21 2005
From: firemoth at gmail.com (Timothy Fitz)
Date: Tue, 10 May 2005 19:28:21 -0400
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <d5b7oj$uav$1@sea.gmane.org>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
Message-ID: <972ec5bd05051016285954fec@mail.gmail.com>

> No, as except clauses can only occur before the finally clause, and execution
> should not go backwards.

This restriction feels a bit arbitrary. I can guarantee someone is
going to flatten this:

try:
    try:
        A
    finally:
        B
except IOError:
    C

A more flexible approach would be to allow finally at the beginning or
ending of the try statement. A more flexible approach would be to
allow both, or even finally clauses mixed in.

To me, that's the ugly portion of this proposal; it's quite arbitrary.
And the alternatives I posted have their own brands of ugly.
Concisely, this is an arbitrary shortcut for an idiom that already
exists. It seems to me that this shortcut would be redundant if PEP
340 or something with similar functionality was accepted.

From foom at fuhm.net  Wed May 11 01:34:30 2005
From: foom at fuhm.net (James Y Knight)
Date: Tue, 10 May 2005 19:34:30 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <793460514a90b965270ec08bc118a3fc@opnet.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427C07C5.7060106@v.loewis.de>
	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>
	<427CC2C2.60000@v.loewis.de>
	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
	<427DD8C1.4060109@v.loewis.de>
	<2bd4e2b88b94f48297ff0dcaef97a7c0@opnet.com>
	<427EEE4F.40405@v.loewis.de>
	<793460514a90b965270ec08bc118a3fc@opnet.com>
Message-ID: <8084AB38-5AE6-42BE-85B8-E2B5E9849AF1@fuhm.net>


On May 10, 2005, at 2:48 PM, Nicholas Bastin wrote:
> On May 9, 2005, at 12:59 AM, Martin v. Löwis wrote:
>
>
>>> Wow, what an inane way of looking at it.  I don't know what world  
>>> you
>>> live in, but in my world, users read the configure options and  
>>> suppose
>>> that they mean something.  In fact, they *have* to go off on  
>>> their own
>>> to assume something, because even the documentation you refer to  
>>> above
>>> doesn't say what happens if they choose UCS-2 or UCS-4.  A logical
>>> assumption would be that python would use those CEFs internally, and
>>> that would be incorrect.
>>>
>>
>> Certainly. That's why the documentation should be improved. Changing
>> the option breaks existing packaging systems, and should not be done
>> lightly.
>>
>
> I'm perfectly happy to continue supporting --enable-unicode=ucs2,  
> but not displaying it as an option.  Is that acceptable to you?
>

If you're going to call python's implementation UTF-16, I'd consider  
all these very serious deficiencies:
- unicodedata doesn't work for 2-char strings containing a surrogate 
pair, nor for integers. Therefore it is impossible to get any data on 
chars > 0xFFFF.
- there are no methods for determining if something is a surrogate 
pair and turning it into an integer codepoint.
- Given that unicodedata doesn't work, I doubt also that .toupper/etc  
work right on surrogate pairs, although I haven't tested.
- As has been noted before, the regexp engine doesn't properly treat  
surrogate pairs as a single unit.
- Is there a method that is like unichr but that will work for  
codepoints > 0xFFFF?

I'm sure there's more as well. I think it's a mistake to consider 
python to be implementing UTF-16 just because it properly 
encodes/decodes surrogate pairs in the UTF-8 codec.

James

From ssouhlal at FreeBSD.org  Mon May  9 20:36:14 2005
From: ssouhlal at FreeBSD.org (Suleiman Souhlal)
Date: Mon, 9 May 2005 14:36:14 -0400
Subject: [Python-Dev] Python continually calling sigprocmask() on FreeBSD 5
Message-ID: <FD8DE27E-C90C-4756-AED9-F4529395294D@FreeBSD.org>

Hello,

While investigating why the script used in
http://docs.freebsd.org/cgi/getmsg.cgi?fetch=148191+0+current/freebsd-stable
used so much system time on FreeBSD 5, I noticed that python is
continually calling sigprocmask(), as can be seen from the following
ktrace(1) dump:

    673 python   0.000007 CALL  sigprocmask(0x3,0,0x811d11c)
    673 python   0.000005 RET   sigprocmask 0
    673 python   0.000009 CALL  sigprocmask(0x1,0,0x8113d1c)
    673 python   0.000005 RET   sigprocmask 0
etc..

This is using Python 2.4.1.
Any clue about why this is happening?
(Please include me in the recipients for any reply, as I'm not 
subscribed)

Bye,
--
Suleiman Souhlal     | ssouhlal at vt.edu
The FreeBSD Project  | ssouhlal at FreeBSD.org


From gvanrossum at gmail.com  Wed May 11 05:19:34 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 10 May 2005 20:19:34 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <972ec5bd05051016285954fec@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
	<972ec5bd05051016285954fec@mail.gmail.com>
Message-ID: <ca471dc205051020194d39ae49@mail.gmail.com>

[Timothy Fitz]
> A more flexible approach would be to allow finally at the beginning or
> ending of the try statement. A more flexible approach would be to
> allow both, or even finally clauses mixed in.

-1000.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From kbk at shore.net  Wed May 11 05:43:45 2005
From: kbk at shore.net (Kurt B. Kaiser)
Date: Tue, 10 May 2005 23:43:45 -0400 (EDT)
Subject: [Python-Dev] Weekly Python Patch/Bug Summary
Message-ID: <200505110343.j4B3hjB1003013@bayview.thirdcreek.com>

Patch / Bug Summary
___________________

Patches :  332 open (+10) /  2834 closed ( +2) /  3166 total (+12)
Bugs    :  927 open ( +7) /  4959 closed ( +7) /  5886 total (+14)
RFE     :  186 open ( +0) /   157 closed ( +1) /   343 total ( +1)

New / Reopened Patches
______________________

Feature enhancement for C socket module  (2005-05-03)
       http://python.org/sf/1194378  opened by  Heiko Wundram

pydoc requires o.__nonzero__() = True  (2005-05-03)
       http://python.org/sf/1194449  opened by  Jay T Miller

[AST] throw SyntaxError in "from x import y,"  (2005-05-03)
       http://python.org/sf/1194895  opened by  logistix

simple callback system for Py_FatalError   (2005-05-05)
       http://python.org/sf/1195571  opened by  m utku

in IDLE, assume new text files are python source by default  (2005-05-06)
       http://python.org/sf/1196895  opened by  Jeff Shute

smarter behaviour for home key in IDLE  (2005-05-06)
       http://python.org/sf/1196903  opened by  Jeff Shute

change recall in IDLE shell to not overwrite current command  (2005-05-06)
       http://python.org/sf/1196917  opened by  Jeff Shute

allow using normal indent width in shell in IDLE  (2005-05-06)
       http://python.org/sf/1196946  opened by  Jeff Shute

_ssl.mak Makefile patch (Win32)  (2005-05-07)
       http://python.org/sf/1197150  opened by  Joachim Kessel

Add proxies arg to urllib.urlretrieve  (2005-05-07)
       http://python.org/sf/1197207  opened by  John Dubery

test_locale fix on modern linux  (2005-05-07)
       http://python.org/sf/1197218  opened by  Anthony Baxter

Cygwin case-sensitive import patch  (2005-05-07)
       http://python.org/sf/1197318  opened by  Jason Tishler

Patches Closed
______________

Minimal cleanup of run.py  (2005-04-26)
       http://python.org/sf/1190163  closed by  kbk

Decimal interaction with __rop__  (2005-03-19)
       http://python.org/sf/1166602  closed by  facundobatista

New / Reopened Bugs
___________________

parsedate and Y2K  (2005-05-02)
       http://python.org/sf/1194222  opened by  Mark Nottingham

Minor bug in urllib docs  (2005-05-03)
       http://python.org/sf/1194249  opened by  Georg Brandl

Reading from a killed shell script with popen* under linux  (2005-05-03)
       http://python.org/sf/1194328  opened by  Vinz

ImportError: No module named os  (2005-05-03)
CLOSED http://python.org/sf/1194497  opened by  Will L G

[AST] Patch [ 1190012 ] should've checked for SyntaxWarnings  (2005-05-04)
       http://python.org/sf/1195576  opened by  logistix

SystemError: error return without exception set  (2005-05-05)
       http://python.org/sf/1195984  opened by  Niraj Bajpai

Error: ... ossaudiodev.c, line 48: Missing type specifier  (2005-05-05)
       http://python.org/sf/1196154  opened by  Will L G

WeakValueDictionary.__init__ is backwards  (2005-05-05)
       http://python.org/sf/1196315  opened by  Pavel Pergamenshchik

string.rstrip strips more than supposed to in some cases  (2005-05-06)
CLOSED http://python.org/sf/1196824  opened by  Francois Payette

trivial bug in error message text  (2005-05-06)
       http://python.org/sf/1196980  opened by  Jeff Shute

% gives wrong results  (2005-05-08)
CLOSED http://python.org/sf/1197806  opened by  Jonathan

Installation path sent to configure  (2005-05-09)
       http://python.org/sf/1197883  opened by  Björn Lindqvist

time module ignores timezone changes  (2005-05-09)
CLOSED http://python.org/sf/1198275  opened by  David Lambert

string.Template not flexible enough to subclass (regexes)  (2005-05-09)
       http://python.org/sf/1198569  opened by  Ian Bicking

subprocess _active.remove(self) self not in list _active  (2005-05-10)
       http://python.org/sf/1199282  opened by  cheops

Bugs Closed
___________

SimpleHTTPServer sends wrong c-length and locks up client  (2005-04-26)
       http://python.org/sf/1190580  closed by  alexanderweb

ImportError: No module named os  (2005-05-03)
       http://python.org/sf/1194497  closed by  rhettinger

Error in section 4.2 of Python Tutorial  (2005-05-02)
       http://python.org/sf/1194209  closed by  rhettinger

Error in section 4.2 of Python Tutorial  (2005-05-02)
       http://python.org/sf/1194209  closed by  rhettinger

string.rstrip strips more than supposed to in some cases  (2005-05-06)
       http://python.org/sf/1196824  closed by  goodger

% gives wrong results  (2005-05-08)
       http://python.org/sf/1197806  closed by  mwh

time module ignores timezone changes  (2005-05-09)
       http://python.org/sf/1198275  closed by  bcannon

calendar.weekheader not found in __all__  (2005-05-02)
       http://python.org/sf/1193890  closed by  rhettinger

RFE Closed
__________

logging module root logger name  (2005-04-27)
       http://python.org/sf/1190689  closed by  vsajip


From greg.ewing at canterbury.ac.nz  Wed May 11 06:41:43 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 11 May 2005 16:41:43 +1200
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc2050509215823876c50@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
Message-ID: <42818D07.8020303@canterbury.ac.nz>

Guido van Rossum wrote:

> Now it would make more sense to change the syntax to
> 
>         with EXPR as VAR:
>             BLOCK
> 
> and we have Phillip Eby's proposal.

Change the 'with' to 'do' and I'd be +1 on this.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Wed May 11 06:41:49 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 11 May 2005 16:41:49 +1200
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
 either a competitor or update to PEP 340)
In-Reply-To: <4280815C.2050007@gmail.com>
References: <20050507140547.64F0.JCARLSON@uci.edu>
	<427D446B.3080707@ronadam.com> <20050507210108.64F4.JCARLSON@uci.edu>
	<427DB731.8060604@ronadam.com> <427F032B.2010402@canterbury.ac.nz>
	<427F5BBE.3050403@gmail.com> <42802643.6060705@canterbury.ac.nz>
	<4280815C.2050007@gmail.com>
Message-ID: <42818D0D.7070103@canterbury.ac.nz>

Nick Coghlan wrote:
> Hmm, with that approach, a code inspection tool like pychecker could be used to 
> pick up the slack, and flag generators which have a yield inside a try/finally 
> or a user defined statement without applying the "needs finalisation" decorator 

What about giving them an __exit__ method if and only
if they have a yield inside a try/finally? Old generators
won't be doing that, because it's currently illegal.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From tdelaney at avaya.com  Wed May 11 06:55:12 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Wed, 11 May 2005 14:55:12 +1000
Subject: [Python-Dev] PEP 340: Deterministic Finalisation (new PEP draft,
	either a competitor or update to PEP 340)
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE02520502@au3010avexu1.global.avaya.com>

Greg Ewing wrote:

> Nick Coghlan wrote:
>> Hmm, with that approach, a code inspection tool like pychecker could
>> be used to pick up the slack, and flag generators which have a yield
>> inside a try/finally or a user defined statement without applying
>> the "needs finalisation" decorator 
> 
> What about giving them an __exit__ method if and only
> if they have a yield inside a try/finally? Old generators
> won't be doing that, because it's currently illegal.

It's possible to create a generator that does not contain a finally, but
still needs cleanup.

    def gen():
        try:
            yield
        except:
            print 'cleanup'
            raise
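
That cleanup path can be demonstrated directly with the generator methods that landed later (in Python 2.5, via PEP 342) — not available at the time of this thread, so treat this purely as an illustration of Tim's point:

```python
log = []

def gen():
    # No finally clause, yet the except clause is cleanup
    # code that must run when the generator is finalised.
    try:
        yield
    except BaseException:
        log.append('cleanup')
        raise

g = gen()
next(g)                      # advance to the yield
try:
    g.throw(RuntimeError)    # simulate finalisation at the yield point
except RuntimeError:
    pass
```

A finaliser that only looks for yield-inside-try/finally would miss this generator, even though skipping its cleanup clearly loses work.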

Tim Delaney

From nbastin at opnet.com  Wed May 11 09:18:59 2005
From: nbastin at opnet.com (Nicholas Bastin)
Date: Wed, 11 May 2005 03:18:59 -0400
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <8084AB38-5AE6-42BE-85B8-E2B5E9849AF1@fuhm.net>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>
	<a39457493d9660bee8d1ace89067c990@opnet.com>
	<427C07C5.7060106@v.loewis.de>
	<dcb880b2b0bee21478dcfebe3070302e@opnet.com>
	<427CC2C2.60000@v.loewis.de>
	<a9d5d2a422db34edd3c79666efdbe0d7@opnet.com>
	<427DD8C1.4060109@v.loewis.de>
	<2bd4e2b88b94f48297ff0dcaef97a7c0@opnet.com>
	<427EEE4F.40405@v.loewis.de>
	<793460514a90b965270ec08bc118a3fc@opnet.com>
	<8084AB38-5AE6-42BE-85B8-E2B5E9849AF1@fuhm.net>
Message-ID: <6d44087494ad67ff600bd3e8d6c14eca@opnet.com>


On May 10, 2005, at 7:34 PM, James Y Knight wrote:
> If you're going to call python's implementation UTF-16, I'd consider 
> all these very serious deficiencies:

The --enable-unicode option declares a character encoding form (CEF), 
not a character encoding scheme (CES).  It is unfortunate that UTF-16 
is a valid option for both of these things, but supporting the CEF does 
not imply supporting the CES.  All of your complaints would be valid if 
we claimed that Python supported the UTF-16 CES, but the language 
itself only needs to support a CEF that everyone understands how to 
work with.

It is widely recognized, I believe, that the general level of unicode 
support exposed to Python users is somewhat lacking when it comes to 
high surrogate pairs.  I'd love for us to fix that problem, or, better 
yet, integrate something like ICU, but this isn't that discussion.

> - unicodedata doesn't work for 2-char strings containing a surrogate 
> pair, nor for integers. Therefore it is impossible to get any data on 
> chars > 0xFFFF.
> - there are no methods for determining if something is a surrogate 
> pair and turning it into an integer codepoint.
> - Given that unicodedata doesn't work, I doubt also that .toupper/etc 
> work right on surrogate pairs, although I haven't tested.
> - As has been noted before, the regexp engine doesn't properly treat 
> surrogate pairs as a single unit.
> - Is there a method that is like unichr but that will work for 
> codepoints > 0xFFFF?
>
> I'm sure there's more as well. I think it's a mistake to consider 
> python to be implementing UTF-16 just because it properly 
> encodes/decodes surrogate pairs in the UTF-8 codec.

Users should understand (and we should write doc to help them 
understand), that using 2-byte wide unicode support in Python means 
that all operations will be done on Code Units, and not Code Points.  
Once you understand this, you can work with the data that is given to 
you, although it's certainly not as nice as what you would have come to 
expect from Python.  (For example, you can correctly construct a regexp 
to find the surrogate pair you're looking for by using the constituent 
code units).
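
As a sketch of that workaround, the constituent code units of a supplementary character can be computed directly; the arithmetic is straight from the UTF-16 definition, while the helper name is made up for illustration:

```python
def surrogate_pair(cp):
    # Split a supplementary code point (> 0xFFFF) into the high and
    # low UTF-16 code units that a narrow build stores it as.
    if cp <= 0xFFFF:
        raise ValueError("BMP code point needs no surrogate pair")
    cp -= 0x10000
    return 0xD800 + (cp >> 10), 0xDC00 + (cp & 0x3FF)

# U+1D11E (MUSICAL SYMBOL G CLEF) is a common supplementary example.
hi, lo = surrogate_pair(0x1D11E)
```

With `hi` and `lo` in hand, a regexp over code units can match the pair explicitly, which is the construction described above.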

--
Nick


From mal at egenix.com  Wed May 11 10:25:46 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Wed, 11 May 2005 10:25:46 +0200
Subject: [Python-Dev] New Py_UNICODE doc
In-Reply-To: <428102A4.5070308@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<9772ff3ac8afbd8d4451968a065e281b@opnet.com>	<49df26051bfb4ade3a00ec2fac9d02e6@fuhm.net>	<a39457493d9660bee8d1ace89067c990@opnet.com>	<427BDFF4.9030900@hathawaymix.org>	<851aac706e3951c4c7c5e6e5467eafff@opnet.com>	<427BF842.8060604@hathawaymix.org>	<427C089F.2010808@v.loewis.de>	<427C76D6.4050409@hathawaymix.org>	<427CC3E3.4090405@v.loewis.de>	<427D024B.6080207@hathawaymix.org>	<427D0BDB.6050802@egenix.com>	<427DD377.6040401@v.loewis.de>	<427F8D76.2060204@egenix.com>
	<427FD68E.6020400@v.loewis.de>	<42807C98.3090208@egenix.com>
	<428102A4.5070308@v.loewis.de>
Message-ID: <4281C18A.1060201@egenix.com>

Martin v. Löwis wrote:
> M.-A. Lemburg wrote:
> 
>>If all you're interested in is the lexical class of the code points
>>in a string, you could use such a codec to map each code point
>>to a code point representing the lexical class.
> 
> 
> How can I efficiently implement such a codec? The whole point is doing
> that in pure Python (because if I had to write an extension module,
> I could just as well do the entire lexical analysis in C, without
> any regular expressions).

You can write such a codec in Python, but C will of course
be more efficient. The whole point is that for things that
you will likely use a lot in your application, it is better
to have one efficient implementation than dozens of
duplicate re character sets embedded in compiled re-expressions.
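
A rough sketch of the "one shared mapping" idea using translation tables (the lexical-class labels here are invented, and a real table would of course cover far more than ASCII):

```python
# Map each code point to a single character naming its lexical
# class: 'L' = letter, 'D' = digit, '_' = everything else.
# Labels and coverage are illustrative only.
table = {}
for cp in range(128):
    ch = chr(cp)
    if ch.isalpha():
        table[cp] = ord('L')
    elif ch.isdigit():
        table[cp] = ord('D')
    else:
        table[cp] = ord('_')

classified = 'abc123!'.translate(table)
```

The point being argued is that one such table, built once, replaces the same character-set data being duplicated inside every compiled regular expression.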

> Any kind of associative/indexed table for this task consumes a lot
> of memory, and takes quite some time to initialize.

Right - which is why an algorithmic approach will always
be more efficient (in terms of speed/memory tradeoff)
and these *can* support surrogates.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 11 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From mwh at python.net  Wed May 11 11:29:42 2005
From: mwh at python.net (Michael Hudson)
Date: Wed, 11 May 2005 10:29:42 +0100
Subject: [Python-Dev] Python continually calling sigprocmask() on
 FreeBSD 5
In-Reply-To: <FD8DE27E-C90C-4756-AED9-F4529395294D@FreeBSD.org> (Suleiman
	Souhlal's message of "Mon, 9 May 2005 14:36:14 -0400")
References: <FD8DE27E-C90C-4756-AED9-F4529395294D@FreeBSD.org>
Message-ID: <2mfywut8jd.fsf@starship.python.net>

Suleiman Souhlal <ssouhlal at FreeBSD.org> writes:

> Hello,
>
> While investigating why the script used in
> http://docs.freebsd.org/cgi/getmsg.cgi?fetch=148191+0+current/freebsd-stable
> used so much system time on FreeBSD 5, I noticed that python is
> continually calling sigprocmask(), as can be seen from the following
> ktrace(1) dump:
>
>     673 python   0.000007 CALL  sigprocmask(0x3,0,0x811d11c)
>     673 python   0.000005 RET   sigprocmask 0
>     673 python   0.000009 CALL  sigprocmask(0x1,0,0x8113d1c)
>     673 python   0.000005 RET   sigprocmask 0
> etc..
>
> This is using Python 2.4.1.
> Any clue about why this is happening?

In a word, no.

As far as I am aware, there are no calls whatsoever to sigprocmask in
Python 2.4.1 (there were in 2.3.X, but these were connected to threads
and the mentioned script doesn't use them).  So you need to do some
digging of your own.

> (Please include me in the recipients for any reply, as I'm not 
> subscribed)

OK, but there's always Gmane if you just want to watch python-dev for
a bit.

Cheers,
mwh

-- 
    . <- the point                                your article -> .
    |------------------------- a long way ------------------------|
                                       -- Christophe Rhodes, ucam.chat

From ncoghlan at gmail.com  Wed May 11 12:41:16 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 11 May 2005 20:41:16 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
References: <5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>	<ca471dc2050509215823876c50@mail.gmail.com>	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
Message-ID: <4281E14C.1070405@gmail.com>

Phillip J. Eby wrote:
> Of course, it's not *really* that simple, because __try__ doesn't exactly 
> correspond to 'try:', and it has to return something, but it sure is 
> simpler than the mental gymnastics I'd go through to convert 
> except/else/finally into "if" statements inside an __exit__.

You don't need to make that translation, though. Instead, you can just reraise 
the passed-in exception inside the __exit__() method:

   def __exit__(self, *exc_info):
       try:
           try:
               if exc_info:
                   raise exc_info[0], exc_info[1], exc_info[2]
           except:
               pass
           else:
               pass
       finally:
           pass

However, the __exit__() method does allow you to switch to using if statements 
if that makes more sense (or would be more efficient). For instance, these are 
possible __exit__ methods for a locking() statement template and a transaction() 
statement template:

   # locking's exit method
   def __exit__(self, *exc_info):
       self.lock.release()
       if exc_info:
           raise exc_info[0], exc_info[1], exc_info[2]

   # transaction's exit method
   def __exit__(self, *exc_info):
       if exc_info:
           self.db.rollback()
           raise exc_info[0], exc_info[1], exc_info[2]
       else:
           self.db.commit()
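
[Editorial sketch: the template above in modern Python 3 syntax, where the three-argument raise form is gone and the protocol became today's with statement. FakeDB is an invented stand-in for the DB object the discussion assumes.]

```python
class FakeDB:
    """Invented stand-in for a DB-API-style connection."""
    def __init__(self):
        self.log = []
    def commit(self):
        self.log.append("commit")
    def rollback(self):
        self.log.append("rollback")

class transaction:
    def __init__(self, db):
        self.db = db
    def __enter__(self):
        return self.db
    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None:
            self.db.commit()
        else:
            self.db.rollback()
        return False  # do not suppress: the exception propagates

db = FakeDB()
with transaction(db):
    pass                      # successful block: commit
try:
    with transaction(db):
        raise ValueError("boom")  # failing block: rollback, then re-raise
except ValueError:
    pass
print(db.log)  # -> ['commit', 'rollback']
```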


I've posted draft 1.4 of my PEP 310/PEP 340 merger PEP (PEP 650, maybe?):
http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

This version cleans up the semantics a lot, so that the examples actually work 
as intended, and there is One Obvious Way to do things like suppressing 
exceptions (i.e. don't reraise them in the __exit__() method). It also 
specifically addresses the question of using two methods in the protocol versus 
four, and shows how an arbitrary try statement can be converted to a statement 
template.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From p.f.moore at gmail.com  Wed May 11 14:36:15 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 11 May 2005 13:36:15 +0100
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <4281E14C.1070405@gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
Message-ID: <79990c6b05051105361e7a9ba0@mail.gmail.com>

On 5/11/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> I've posted draft 1.4 of my PEP 310/PEP 340 merger PEP (PEP 650, maybe?):
> http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

I've been skipping the discussion, but this is starting to look pretty
good. I'll give it a proper read soon. However, one thing immediately
struck me: if __exit__ gets an exception and does not re-raise it, it
is silently ignored. This seems like a bad idea - the usual "errors
should not pass silently" applies. I can very easily imagine statement
templates accidentally eating KeyboardInterrupt or SystemExit
exceptions.

At the very least, there should be a section in "rejected
alternatives" explaining why it is not appropriate to force reraising
of exceptions unless explicit action is taken. There could be good
reasons (as I say, I haven't followed the discussion) but they should
be recorded. And if there aren't any good reasons, this behaviour
should probably be changed.
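
[Editorial sketch: the hazard described above is easy to demonstrate with the context-manager protocol as it later landed, where a true return from __exit__ suppresses the exception. All names here are invented for illustration.]

```python
class eats_everything:
    """Deliberately buggy template: suppresses every exception."""
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc_val, exc_tb):
        return True  # suppresses ANY exception, including SystemExit

raised = False
try:
    with eats_everything():
        raise SystemExit(1)  # would normally terminate the program
except SystemExit:
    raised = True
print(raised)  # -> False: the exit request was silently discarded
```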

Paul.

PS Apologies if I missed the discussion of this in the PEP - as I say,
I've only skimmed it so far.

From andymac at bullseye.apana.org.au  Wed May 11 12:51:27 2005
From: andymac at bullseye.apana.org.au (Andrew MacIntyre)
Date: Wed, 11 May 2005 21:51:27 +1100
Subject: [Python-Dev] Python continually calling sigprocmask() on
 FreeBSD 5
In-Reply-To: <2mfywut8jd.fsf@starship.python.net>
References: <FD8DE27E-C90C-4756-AED9-F4529395294D@FreeBSD.org>
	<2mfywut8jd.fsf@starship.python.net>
Message-ID: <4281E3AF.9040101@bullseye.apana.org.au>

Michael Hudson wrote:
> Suleiman Souhlal <ssouhlal at FreeBSD.org> writes:
>
>>While investigating why the script used in
>>http://docs.freebsd.org/cgi/getmsg.cgi?fetch=148191+0+current/freebsd-stable
>>used so much system time on FreeBSD 5, I noticed that python is continually
>>calling sigprocmask(), as can be seen from the following ktrace(1) dump:
>>
>>    673 python   0.000007 CALL  sigprocmask(0x3,0,0x811d11c)
>>    673 python   0.000005 RET   sigprocmask 0
>>    673 python   0.000009 CALL  sigprocmask(0x1,0,0x8113d1c)
>>    673 python   0.000005 RET   sigprocmask 0
>>etc..
>>
>>This is using Python 2.4.1.
>>Any clue about why this is happening?
> 
> 
> In a word, no.
> 
> As far as I am aware, there are no calls whatsoever to sigprocmask in
> Python 2.4.1 (there were in 2.3.X, but these were connected to threads
> and the mentioned script doesn't use them).  So you need to do some
> digging of your own.

As I noted in a followup to the FreeBSD list where this came up, this
appears to be a consequence of FreeBSD's Python port being configure'ed
with the "--with-fpectl" option.  This option uses setjmp()/longjmp()
around floating point ops for FP exception management, and it is the
setjmp()/longjmp() in the FreeBSD threaded C libs that calls
sigprocmask() as above.

This option was certainly required for FreeBSD 4.x and earlier, where
SIGFPE wasn't masked by default (at least that's my understanding of
why the option was required...).  I have the understanding that starting
with 5.0 or 5.1, FreeBSD now does mask SIGFPE by default and so this
option may not be necessary in this environment.  A FreeBSD list member
with a 5.3 system tested the microbenchmark that exposed the issue with
a Python interpreter built without "--with-fpectl" and the performance
issue disappeared.  It is not yet clear whether it is appropriate that
the port be changed to drop this configure option.

Regards,
Andrew.

-------------------------------------------------------------------------
Andrew I MacIntyre                     "These thoughts are mine alone..."
E-mail: andymac at bullseye.apana.org.au  (pref) | Snail: PO Box 370
        andymac at pcug.org.au             (alt) |        Belconnen ACT 2616
Web:    http://www.andymac.org/               |        Australia

From sselva at midascomm.com  Wed May 11 13:58:10 2005
From: sselva at midascomm.com (duraivel)
Date: Wed, 11 May 2005 17:28:10 +0530
Subject: [Python-Dev] Re: Kernel panic writing to /dev/dsp with cmpci driver
Message-ID: <000501c55620$b535a0a0$9e97fea9@duraivel>

thanx and regards
duraivel
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mail.python.org/pipermail/python-dev/attachments/20050511/924a3922/attachment.htm

From aahz at pythoncraft.com  Wed May 11 16:36:38 2005
From: aahz at pythoncraft.com (Aahz)
Date: Wed, 11 May 2005 07:36:38 -0700
Subject: [Python-Dev] CHANGE BayPIGgies: May *THIRD* Thurs
Message-ID: <20050511143638.GA14331@panix.com>

Reminder:

We will *NOT* be meeting the *SECOND* Thursday (this week, May 12).

Our May meeting will be the *THIRD* Thursday, May 19.  This will be our
first meeting at Google, with Alex Martelli's presentation on design
patterns.  More details soon!
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"And if that makes me an elitist...I couldn't be happier."  --JMS

From gvanrossum at gmail.com  Wed May 11 19:00:19 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 11 May 2005 10:00:19 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <42818D07.8020303@canterbury.ac.nz>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<42818D07.8020303@canterbury.ac.nz>
Message-ID: <ca471dc2050511100011cdcfcb@mail.gmail.com>

[Guido]
> > Now it would make more sense to change the syntax to
> >
> >         with EXPR as VAR:
> >             BLOCK
> >
> > and we have Phillip Eby's proposal.

[Greg]
> Change the 'with' to 'do' and I'd be +1 on this.

Great! But the devil is in the details. I want to reduce the
complexity, and I'm willing to reduce the functionality somewhat. It
would help if you could support this.

In particular, while I like the use case transactional() (example 3 in
PEP 340) enough to want some indicator of success or failure, I don't
see the point of having separate __except__, __else__ and __finally__
entry points.  It's unclear how these would be mapped to the generator
API, and whether more than one could be called e.g. what if __else__
raises an exception itself -- will __finally__ be called?  I'm sure
that could be specified rigorously, but I'm not so sure that it is
going to be useful and clear.

I see several possible levels of information that could be passed to
the __exit__ call:

(1) None. This is insufficient for the transactional() use case.

(2) Only a bool indicating success or failure. This is sufficient for
the transactional() use case.

(3) Full exception information, with the understanding that when
__exit__() returns normally, exception processing will resume as usual
(i.e. __exit__() is called from a finally clause). Exceptions raised
from __exit__() are considered errors/bugs in __exit__() and should be
avoided by robust __exit__() methods.

(4) Like (3), but also distinguish between non-local gotos
(break/continue/return), exceptions, and normal completion of BLOCK.
(I'm not sure that this is really different from (3).)
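
[Editorial sketch: option (2) made concrete. The harness below passes only a success flag to the exit call; run_block and Transactional are invented names, not anything from the PEP.]

```python
def run_block(template, block):
    """Toy expansion of option (2): __exit__ sees only success/failure."""
    template.__enter__()
    try:
        block()
    except BaseException:
        template.__exit__(False)  # failure: e.g. roll back
        raise                     # exception still propagates
    else:
        template.__exit__(True)   # success: e.g. commit

class Transactional:
    def __init__(self):
        self.outcome = None
    def __enter__(self):
        pass
    def __exit__(self, succeeded):
        self.outcome = "commit" if succeeded else "rollback"

t = Transactional()
run_block(t, lambda: None)
print(t.outcome)  # -> commit

t2 = Transactional()
try:
    run_block(t2, lambda: 1 / 0)
except ZeroDivisionError:
    pass
print(t2.outcome)  # -> rollback
```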

What do you think?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From barry at python.org  Wed May 11 19:20:01 2005
From: barry at python.org (Barry Warsaw)
Date: Wed, 11 May 2005 13:20:01 -0400
Subject: [Python-Dev] Python 2.4 set objects and cyclic garbage
Message-ID: <1115832001.10867.61.camel@geddy.wooz.org>

In setobject.c rev 1.26 + 1.27 Raymond removed gc support from built-in
set objects, claiming in the log message that "sets cannot have
cycles".  Yet the attached program creates a cycle that I don't believe
ever gets reclaimed.
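
[Editorial sketch: a set cannot contain itself directly (sets are unhashable), but an element that refers back to the set closes a cycle. This is a minimal reconstruction of the kind of cycle the attached cycle.py presumably creates, not the script itself.]

```python
import gc

class Node:
    pass

n = Node()
s = {n}      # the set holds the node...
n.back = s   # ...and the node refers back to the set: a reference cycle

del n, s     # without GC support on sets, this cycle would leak
collected = gc.collect()
print(collected >= 1)  # -> True: the cycle is found and reclaimed
```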

Patch 1200018 restores GC support to set objects for Python 2.4.  Thanks
to my colleague Matt Messier for finding this.

kick-me-if-i'm-smoking-something-ly y'rs,
-Barry

http://sourceforge.net/tracker/index.php?func=detail&aid=1200018&group_id=5470&atid=305470

-------------- next part --------------
A non-text attachment was scrubbed...
Name: cycle.py
Type: application/x-python
Size: 367 bytes
Desc: not available
Url : http://mail.python.org/pipermail/python-dev/attachments/20050511/afd2cb24/cycle.bin

From pje at telecommunity.com  Wed May 11 19:43:26 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Wed, 11 May 2005 13:43:26 -0400
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc2050511100011cdcfcb@mail.gmail.com>
References: <42818D07.8020303@canterbury.ac.nz>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<42818D07.8020303@canterbury.ac.nz>
Message-ID: <5.1.1.6.0.20050511132607.01f66da8@mail.telecommunity.com>

At 10:00 AM 5/11/2005 -0700, Guido van Rossum wrote:
>(3) Full exception information, with the understanding that when
>__exit__() returns normally, exception processing will resume as usual
>(i.e. __exit__() is called from a finally clause). Exceptions raised
>from __exit__() are considered errors/bugs in __exit__() and should be
>avoided by robust __exit__() methods.

FYI, there are still use cases for clearing the exception state in an 
__exit__ method, that might justify allowing a true return from __exit__ to 
suppress the error.  e.g.:


     class Attempt(object):
         def __init__(self,type,last=False):
             self.type = type
             self.last = last
         def __enter__(self): pass
         def __exit__(self,*exc):
             if exc and not self.last and issubclass(exc[0],self.type):
                 # suppress exception
                 return True

     def retry(count, type=Exception):
         attempt = Attempt(type)
         for i in range(count-1):
             yield attempt
         yield Attempt(type, True)

     # usage:

     for attempt in retry(3):
         do attempt:
            somethingThatCouldFail()

and:

     class logging_exceptions(object):
         def __init__(self,logger): self.logger = logger
         def __enter__(self): pass
         def __exit__(self,*exc):
             if exc:
                 # log and suppress error
                 self.logger.error("Unexpected error", exc_info=exc)
                 return True

     while True:
         do logging_exceptions(my_logger):
             processEvents()
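
[Editorial sketch: the logging template above runs nearly unchanged under the protocol as it later landed, with the proposed 'do' spelled 'with' and a true return from __exit__ suppressing the exception. Uses the stdlib logging module; the demo body is invented.]

```python
import logging

class logging_exceptions:
    def __init__(self, logger):
        self.logger = logger
    def __enter__(self):
        pass
    def __exit__(self, *exc):
        if exc[0] is not None:
            # log and suppress the error
            self.logger.error("Unexpected error", exc_info=exc)
            return True

my_logger = logging.getLogger("demo")
survived = False
with logging_exceptions(my_logger):
    raise RuntimeError("event processing failed")  # logged, then suppressed
survived = True
print(survived)  # -> True: execution continues past the block
```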


From gvanrossum at gmail.com  Wed May 11 19:42:27 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 11 May 2005 10:42:27 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <5.1.1.6.0.20050511132607.01f66da8@mail.telecommunity.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<42818D07.8020303@canterbury.ac.nz>
	<ca471dc2050511100011cdcfcb@mail.gmail.com>
	<5.1.1.6.0.20050511132607.01f66da8@mail.telecommunity.com>
Message-ID: <ca471dc205051110429ef4f56@mail.gmail.com>

[Phillip J. Eby]
> FYI, there are still use cases for clearing the exception state in an
> __exit__ method, that might justify allowing a true return from __exit__ to
> suppress the error.  e.g.:
[...]

Yes, but aren't those written clearer using an explicit try/except?
IMO anything that actually stops an exception from propagating outward
is worth an explicit try/except clause, so the reader knows what is
happening.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Wed May 11 20:05:15 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Wed, 11 May 2005 14:05:15 -0400
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc205051110429ef4f56@mail.gmail.com>
References: <5.1.1.6.0.20050511132607.01f66da8@mail.telecommunity.com>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<42818D07.8020303@canterbury.ac.nz>
	<ca471dc2050511100011cdcfcb@mail.gmail.com>
	<5.1.1.6.0.20050511132607.01f66da8@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050511135431.02056c08@mail.telecommunity.com>

At 10:42 AM 5/11/2005 -0700, Guido van Rossum wrote:
>[Phillip J. Eby]
> > FYI, there are still use cases for clearing the exception state in an
> > __exit__ method, that might justify allowing a true return from __exit__ to
> > suppress the error.  e.g.:
>[...]
>
>Yes, but aren't those written clearer using an explicit try/except?
>IMO anything that actually stops an exception from propagating outward
>is worth an explicit try/except clause, so the reader knows what is
>happening.

I thought the whole point of PEP 340 was to allow abstraction and reuse of 
patterns that currently use "try" blocks, including "except" as well as 
"finally".  So, if you're only going to allow try/finally abstraction, 
wouldn't it make more sense to call it __finally__ instead of 
__exit__?  That would make it clearer that this is purely for try/finally 
patterns, and not error handling patterns.

As for whether they're written more clearly using an explicit try/except, I 
don't know.  Couldn't you say exactly the same thing about explicit 
try/finally?  For that matter, if you used function calls, doesn't it 
produce the same issue, e.g.:

     def retry(count,exc_type=Exception):
         def attempt(func):
             try: func()
             except exc_type: pass
         for i in range(count-1):
             yield attempt
         yield lambda f: f()

     for attempt in retry(3):
         attempt(somethingThatMightFail)

Is this bad style too?


From gvanrossum at gmail.com  Wed May 11 20:23:38 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 11 May 2005 11:23:38 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <5.1.1.6.0.20050511135431.02056c08@mail.telecommunity.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<42818D07.8020303@canterbury.ac.nz>
	<ca471dc2050511100011cdcfcb@mail.gmail.com>
	<5.1.1.6.0.20050511132607.01f66da8@mail.telecommunity.com>
	<ca471dc205051110429ef4f56@mail.gmail.com>
	<5.1.1.6.0.20050511135431.02056c08@mail.telecommunity.com>
Message-ID: <ca471dc2050511112374272cc@mail.gmail.com>

> >[Phillip J. Eby]
> > > FYI, there are still use cases for clearing the exception state in an
> > > __exit__ method, that might justify allowing a true return from __exit__ to
> > > suppress the error.  e.g.:
> >[...]

[Guido]
> >Yes, but aren't those written clearer using an explicit try/except?
> >IMO anything that actually stops an exception from propagating outward
> >is worth an explicit try/except clause, so the reader knows what is
> >happening.

[Phillip]
> I thought the whole point of PEP 340 was to allow abstraction and reuse of
> patterns that currently use "try" blocks, including "except" as well as
> "finally".

Indeed it was. But I'm getting a lot of pushback on the PEP so I'm
exploring a simpler proposal with a more limited use case -- that of
PEP 310, basically. Note that generators written for this new proposal
do not contain try/finally or try/except (and don't need it); they
simply contain some actions before the yield and some actions after
it, and the with/do/block/stmt statement takes care of calling it.

>  So, if you're only going to allow try/finally abstraction,
> wouldn't it make more sense to call it __finally__ instead of
> __exit__?  That would make it clearer that this is purely for try/finally
> patterns, and not error handling patterns.

I don't think the name matters that much; __exit__ doesn't
particularly mean "error handling" to me. I think PEP 310 proposed
__enter__ and __exit__ and I was just following that; I've also
thought of __enter__/__leave__ or even the nostalgic
__begin__/__end__.

> As for whether they're written more clearly using an explicit try/except, I
> don't know.  Couldn't you say exactly the same thing about explicit
> try/finally?

For try/finally we have a large body of use cases that just scream for
abstraction. I'm not convinced that we have the same for try/except.

Maybe the key is this: with try/finally, the control flow is
unaffected whether the finally clause is present or not, so hiding it
from view doesn't matter much for understanding the code; in fact in
my mind when I see a try/finally clause I mentally translate it to
something that says "that resource is held for the duration of this
block" so I can stop thinking about the details of releasing that
resource. try/except, on the other hand, generally changes the control
flow, and there is much more variety in the except clause. I don't
think the need for abstraction is the same.

>  For that matter, if you used function calls, doesn't it
> produce the same issue, e.g.:
> 
>      def retry(count,exc_type=Exception):
>          def attempt(func):
>              try: func()
>              except exc_type: pass
>          for i in range(count-1):
>              yield attempt
>          yield lambda f: f()
> 
>      for attempt in retry(3):
>          attempt(somethingThatMightFail)
> 
> Is this bad style too?

Yes (not to mention that the retry def is unreadable and that the
'attempt' callable feels magical). There are many ways of coding retry
loops and seeing the bottom two lines (the use) in isolation doesn't
give me an idea of what happens when the 3rd attempt fails. Here,
EIBTI.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From shane at hathawaymix.org  Wed May 11 20:44:19 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Wed, 11 May 2005 12:44:19 -0600
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <5.1.1.6.0.20050511132607.01f66da8@mail.telecommunity.com>
References: <42818D07.8020303@canterbury.ac.nz>	<ca471dc2050509215823876c50@mail.gmail.com>	<42818D07.8020303@canterbury.ac.nz>
	<5.1.1.6.0.20050511132607.01f66da8@mail.telecommunity.com>
Message-ID: <42825283.4020802@hathawaymix.org>

Phillip J. Eby wrote:
> FYI, there are still use cases for clearing the exception state in an 
> __exit__ method, that might justify allowing a true return from __exit__ to 
> suppress the error.  e.g.:

Maybe __exit__ could suppress exceptions using a new idiom:

         def __exit__(self,*exc):
             if exc and not self.last and issubclass(exc[0],self.type):
                 # suppress the exception
                 raise None

This seems clearer than "return True".

Shane

From pje at telecommunity.com  Wed May 11 20:53:30 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Wed, 11 May 2005 14:53:30 -0400
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <42825283.4020802@hathawaymix.org>
References: <5.1.1.6.0.20050511132607.01f66da8@mail.telecommunity.com>
	<42818D07.8020303@canterbury.ac.nz>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<42818D07.8020303@canterbury.ac.nz>
	<5.1.1.6.0.20050511132607.01f66da8@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050511145227.01f6c4e8@mail.telecommunity.com>

At 12:44 PM 5/11/2005 -0600, Shane Hathaway wrote:
>Phillip J. Eby wrote:
> > FYI, there are still use cases for clearing the exception state in an
> > __exit__ method, that might justify allowing a true return from 
> __exit__ to
> > suppress the error.  e.g.:
>
>Maybe __exit__ could suppress exceptions using a new idiom:
>
>          def __exit__(self,*exc):
>              if exc and not last and issubclass(exc[0],self.type):
>                  # suppress the exception
>                  raise None
>
>This seems clearer than "return True".

Nice, although perhaps a little too cute.  But it's moot as Guido has 
vetoed the whole idea.


From rowen at cesmail.net  Wed May 11 21:43:09 2005
From: rowen at cesmail.net (Russell E. Owen)
Date: Wed, 11 May 2005 12:43:09 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
	<972ec5bd05051016285954fec@mail.gmail.com>
Message-ID: <rowen-718BBF.12430811052005@sea.gmane.org>

In article <972ec5bd05051016285954fec at mail.gmail.com>,
 Timothy Fitz <firemoth at gmail.com> wrote:

> > No, as except clauses can only occur before the finally clause, and 
> > execution
> > should not go backwards.
> 
> This restriction feels a bit arbitrary. I can guarantee someone is
> going to flatten this:
> 
> try:
>     try:
>         A
>     finally:
>         B
> except IOError:
>    C
> 
> A more flexible approach would be to allow finally at the beginning or
> ending of the try statement. A more flexible approach would be to
> allow both, or even finally clauses mixed in.
>
> To me, that's the ugly portion of this proposal, it's quite arbitrary.
> And the alternatives I posted have their own brands of ugly.

I strongly disagree. It makes sense to me, anyway, that "finally" can 
only be the final clause and that it always does exactly what it says: 
execute as the final bit of the try statement.

I think this would be a useful enhancement. It simplifies the published 
documentation a bit (no need to document try/except as a separate entity 
from try/finally) and I have plenty of cases where I'd like to take 
advantage of it.

> Concisely, this is an arbitrary shortcut for an idiom that already
> exists. It seems to me that this shortcut would be redundant if PEP
> 340 or something with similar functionality was accepted.

I do see your point here and I'll be curious to see how this shapes up 
(given the lengthy discussion going on about this tricky proposal).

But I also feel that the unification of try/except and try/finally is 
truly an improvement to the language. To me it feels like a 
simplification -- it removes an artificial, annoying and unnecessary 
split.

-- Russell


From steven.bethard at gmail.com  Wed May 11 22:05:30 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Wed, 11 May 2005 14:05:30 -0600
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <rowen-718BBF.12430811052005@sea.gmane.org>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
	<972ec5bd05051016285954fec@mail.gmail.com>
	<rowen-718BBF.12430811052005@sea.gmane.org>
Message-ID: <d11dcfba05051113053cc67f91@mail.gmail.com>

On 5/11/05, Russell E. Owen <rowen at cesmail.net> wrote:
> I think this would be a useful enhancement. It simplifies the published
> documentation a bit (no need to document try/except as a separate entity
> from try/finally) and I have plenty of cases where I'd like to take
> advantage of it.

I have a feeling that it might actually be easier to continue to
document try/except and try/finally separately and then just give the
semantics of try/except/finally in terms of the other semantics.  Take
a look at the Java Language Specification[1] (pages 399-401) if you
want to see how nasty documenting try/except/finally can get.  And
they don't even have an else clause! ;-)

STeVe

[1]http://java.sun.com/docs/books/jls/download/langspec-3.0.pdf
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From ncoghlan at gmail.com  Wed May 11 22:55:48 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 12 May 2005 06:55:48 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <79990c6b05051105361e7a9ba0@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>	
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>	
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>	
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
Message-ID: <42827154.4040006@gmail.com>

Paul Moore wrote:
> At the very least, there should be a section in "rejected
> alternatives" explaining why it is not appropriate to force reraising
> of exceptions unless explicit action is taken. There could be good
> reasons (as I say, I haven't followed the discussion) but they should
> be recorded. And if there aren't any good reasons, this behaviour
> should probably be changed.

This is a good point - it's one of the things I changed in the simplification of 
the semantics between 1.3 and 1.4 (previously the behaviour was much as you 
describe).

The gist is that the alternative is to require an __exit__() method to raise 
TerminateBlock in order to suppress an exception. Then the call to __exit__() in 
the except clause needs to be wrapped in a try/except TerminateBlock/else, with 
the else reraising the exception, and the except clause suppressing it. The 
try/except around BLOCK1 has to have a clause added to reraise TerminateBlock so 
that it isn't inadvertently suppressed when it is raised by user code (although 
that may be a feature, not a bug! I'll have to think about that one).
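
[Editorial sketch: the TerminateBlock alternative spelled out. This is a hypothetical reconstruction of the draft's machinery, not the PEP's actual expansion; TerminateBlock, run_do_statement and the template classes are invented names.]

```python
class TerminateBlock(Exception):
    """Raised by an __exit__ method to suppress the block's exception."""

def run_do_statement(template, block):
    # Exceptions propagate by default; __exit__ must raise
    # TerminateBlock to suppress one.
    template.__enter__()
    try:
        block()
    except TerminateBlock:
        raise  # TerminateBlock from user code is re-raised, not swallowed
    except BaseException as err:
        try:
            template.__exit__(type(err), err, err.__traceback__)
        except TerminateBlock:
            pass       # __exit__ chose to suppress the exception
        else:
            raise      # default: the original exception propagates
    else:
        template.__exit__(None, None, None)

class Suppressing:
    def __enter__(self):
        pass
    def __exit__(self, *exc):
        if exc[0] is not None:
            raise TerminateBlock  # explicitly suppress

run_do_statement(Suppressing(), lambda: 1 / 0)  # exception suppressed
print("survived")
```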

Basically, it's possible to set it up that way, but it was a little ugly and 
hard to explain. "It's suppressed if you don't reraise it" is very simple, but 
makes it easier (too easy?) to inadvertently suppress exceptions.

If people are comfortable with a little extra ugliness in the semantic details 
of 'do' statements in order to make it easier to write correct __exit__() 
methods, then I'm happy to change it back (I think I just talked myself into 
doing that, actually).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Wed May 11 23:04:35 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 12 May 2005 07:04:35 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <79990c6b05051105361e7a9ba0@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>	
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>	
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>	
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
Message-ID: <42827363.7070105@gmail.com>

Paul Moore wrote:
> PS Apologies if I missed the discussion of this in the PEP - as I say,
> I've only skimmed it so far.

With Guido's latest comments, it looks like this is going to be going into the 
"Open Issues" section - his current inclination is that do statements should 
only abstract finally clauses, not arbitrary exception handling. I believe he's 
misinterpreting the cause of the pushback against PEP 340 (I think it was the 
looping that was found objectionable, not the ability to suppress exceptions), 
but *shrug* :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From gvanrossum at gmail.com  Wed May 11 23:11:44 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 11 May 2005 14:11:44 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <42827363.7070105@gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827363.7070105@gmail.com>
Message-ID: <ca471dc20505111411e63939f@mail.gmail.com>

[Nick Coghlan]
> With Guido's latest comments, it looks like this is going to be going into the
> "Open Issues" section - his current inclination is that do statements should
> only abstract finally clauses, not arbitrary exception handling. I believe he's
> misinterpreting the cause of the pushback against PEP 340 (I think it was the
> looping that was found objectionable, not the ability to suppress exceptions),
> but *shrug* :)

I realize that the pushback was against looping, but whereas in the
PEP 340 proposal general exception handling comes out naturally, it
feels as an ugly wart in the modified PEP 310 proposal.

Plus I think the use cases are much weaker (even PEP 340 doesn't have
many use cases for exception handling -- the only one is the
auto-retry example).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Wed May 11 23:30:27 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 11 May 2005 14:30:27 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <d11dcfba05051113053cc67f91@mail.gmail.com>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
	<972ec5bd05051016285954fec@mail.gmail.com>
	<rowen-718BBF.12430811052005@sea.gmane.org>
	<d11dcfba05051113053cc67f91@mail.gmail.com>
Message-ID: <ca471dc2050511143045371d@mail.gmail.com>

[Steven Bethard]
> I have a feeling that it might actually be easier to continue to
> document try/except and try/finally separately and then just give the
> semantics of try/except/finally in terms of the other semantics.  Take
> a look at the Java Language Specification[1] (pages 399-401) if you
> want to see how nastly documenting try/except/finally can get.  And
> they don't even have an else clause! ;-)

Fine with me.

Can I go ahead and approve this now before someone proposes to add a
new keyword?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From steven.bethard at gmail.com  Wed May 11 23:44:40 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Wed, 11 May 2005 15:44:40 -0600
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <42827154.4040006@gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
Message-ID: <d11dcfba05051114441404ef0a@mail.gmail.com>

On 5/11/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> The gist is that the alternative is to require an __exit__() method to raise
> TerminateBlock in order to suppress an exception.

So I didn't see any examples that really needed TerminateBlock to
suppress an exception.  If the semantics of a do-statement was simply:

    stmt = EXPR1
    try:
        stmt_enter = stmt.__enter__
        stmt_exit = stmt.__exit__
    except AttributeError:
        raise TypeError("User defined statement template required")
    
    VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
    exc = ()
    try:
        try:
            BLOCK1
        except:
            exc = sys.exc_info()
    finally:
        stmt_exit(*exc)

would this make any of the examples impossible to write?  All you have
to do to suppress an exception is to not reraise it in __exit__. 
These semantics would make a normally completed BLOCK1 look like a
BLOCK1 exited through return, break or continue, but do we have any
use cases that actually need this distinction?  I couldn't see any,
but I've been reading *way* too many PEP 310/340 posts so probably my
eyes are just tired. ;-)

If you want the default to be that the exception gets re-raised
(instead of being suppressed as it is above), I think you could just
change the finally block to something like:

    finally:
        if stmt_exit(*exc):
           raise exc[0], exc[1], exc[2] 

That would mean that if any nonzero object was returned from __exit__,
the exception would be reraised.

But, like I say, I've been reading this thread way too much, so I'm
probably just missing the TerminateBlock use cases. =)

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From steven.bethard at gmail.com  Wed May 11 23:47:58 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Wed, 11 May 2005 15:47:58 -0600
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <d11dcfba05051114441404ef0a@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
Message-ID: <d11dcfba050511144750e36a5d@mail.gmail.com>

On 5/11/05, Steven Bethard <steven.bethard at gmail.com> wrote:
> If you want the default to be that the exception gets re-raised
> (instead of being suppressed as it is above), I think you could just
> change the finally block to something like:
> 
>     finally:
>         if stmt_exit(*exc):
>            raise exc[0], exc[1], exc[2]
> 
> That would mean that if any nonzero object was returned from __exit__,
> the exception would be reraised.

Oops.  This should have been:

    finally:
       if not stmt_exit(*exc):
          raise exc[0], exc[1], exc[2]

This would mean that if a function returned None (or any other "False"
object) the exception would be reraised.  Suppressing the reraising of
the exception would require returning a nonzero object.
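
Modeled as a plain helper function (the names are purely illustrative, and
I've added a guard so nothing is re-raised when no exception occurred), the
proposal looks like:

```python
import sys

def do_statement(stmt, block):
    # Models 'do EXPR: BLOCK': a None result from __exit__ re-raises,
    # any other ("nonzero") result suppresses the exception.
    var = stmt.__enter__()  # 'var' would be bound by an 'as' clause
    exc = ()
    try:
        try:
            block(var)
        except:
            exc = sys.exc_info()
    finally:
        # Guard on 'exc' so a falsy __exit__ result with no pending
        # exception doesn't try to re-raise anything.
        if not stmt.__exit__(*exc) and exc:
            raise exc[1]

class suppressing:
    def __enter__(self): return self
    def __exit__(self, *exc): return True    # nonzero -> suppress

class reraising:
    def __enter__(self): return self
    def __exit__(self, *exc): pass           # None -> re-raise

def block(var):
    raise ValueError("boom")

do_statement(suppressing(), block)           # exception swallowed
try:
    do_statement(reraising(), block)
except ValueError:
    print("re-raised")
```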

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From gvanrossum at gmail.com  Thu May 12 00:03:27 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 11 May 2005 15:03:27 -0700
Subject: [Python-Dev] Breaking off Enhanced Iterators PEP from PEP 340
In-Reply-To: <d11dcfba050506111622ea81e3@mail.gmail.com>
References: <ca471dc2050506104520ed25bd@mail.gmail.com>
	<002001c55264$8065f560$11bd2c81@oemcomputer>
	<d11dcfba050506111622ea81e3@mail.gmail.com>
Message-ID: <ca471dc2050511150318ea24a1@mail.gmail.com>

[Steven Bethard]
> Well, busy-work or not, I took the 20 minutes to split them up, so I
> figured I might as well make them available.  It was actually really
> easy to split them apart, and I think they both read better this way,
> but I'm not sure my opinion counts for much here anyway. ;-)  (The
> Enhanced Iterators PEP is first, the remainder of PEP 340 follows it.)

Thanks Steven! I've finally found the few minutes it takes to check in
your changes.

All: the proposal of the extended continue statement, which passes a
value to yield, is now separated off into PEP 342, Enhanced Iterators.
(I noticed I snuck in another syntactic change, making yield without
an expression legal. I'm updating PEP 342 now to add that explicitly.)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Thu May 12 05:28:37 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 11 May 2005 20:28:37 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <d11dcfba05051114441404ef0a@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
Message-ID: <ca471dc20505112028b263df6@mail.gmail.com>

[Steven Bethard]
> So I didn't see any examples that really needed TerminateBlock to
> suppress an exception.  If the semantics of a do-statement was simply:
> 
>     stmt = EXPR1
>     try:
>         stmt_enter = stmt.__enter__
>         stmt_exit = stmt.__exit__
>     except AttributeError:
>         raise TypeError("User defined statement template required")
> 
>     VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
>     exc = ()
>     try:
>         try:
>             BLOCK1
>         except:
>             exc = sys.exc_info()
>     finally:
>         stmt_exit(*exc)
> 
> would this make any of the examples impossible to write?  All you have
> to do to suppress an exception is to not reraise it in __exit__.

But this use case would contain a trap for the unwary user who is
writing an __exit__ method -- they have to remember to re-raise an
exception if it was passed in, but that's easy to forget (and slightly
tricky since you have to check the arg count or whether the first
argument is not None).

Going for all-out simplicity, I would like to be able to write these examples:

class locking:
    def __init__(self, lock): self.lock = lock
    def __enter__(self): self.lock.acquire()
    def __exit__(self, *args): self.lock.release()

class opening:
    def __init__(self, filename): self.filename = filename
    def __enter__(self): self.f = open(self.filename); return self.f
    def __exit__(self, *args): self.f.close()

And do EXPR as VAR: BLOCK would mentally be translated into

itr = EXPR
VAR = itr.__enter__()
try: BLOCK
finally: itr.__exit__(*sys.exc_info()) # Except sys.exc_info() isn't defined by finally
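
Spelled out as a runnable sketch (with a helper function standing in for the
proposed syntax, since the statement doesn't exist yet):

```python
import sys
import threading

class locking:
    def __init__(self, lock): self.lock = lock
    def __enter__(self): self.lock.acquire()
    def __exit__(self, *args): self.lock.release()

def do_expr(template, block):
    # The mental translation above as a function: __enter__, run the
    # block, then __exit__ in a finally clause.
    template.__enter__()
    try:
        block()
    finally:
        template.__exit__(*sys.exc_info())

lock = threading.Lock()
do_expr(locking(lock), lambda: None)
print(lock.locked())                # lock released on normal exit

try:
    do_expr(locking(lock), lambda: 1 / 0)
except ZeroDivisionError:
    pass
print(lock.locked())                # released on exception too
```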

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From steven.bethard at gmail.com  Thu May 12 07:09:35 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Wed, 11 May 2005 23:09:35 -0600
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505112028b263df6@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com>
Message-ID: <d11dcfba05051122096fac72be@mail.gmail.com>

[Guido]
> Going for all-out simplicity, I would like to be able to write these examples:
> 
> class locking:
>     def __init__(self, lock): self.lock = lock
>     def __enter__(self): self.lock.acquire()
>     def __exit__(self, *args): self.lock.release()
> 
> class opening:
>     def __init__(self, filename): self.filename = filename
>     def __enter__(self): self.f = open(self.filename); return self.f
>     def __exit__(self, *args): self.f.close()
> 
> And do EXPR as VAR: BLOCK would mentally be translated into
> 
> itr = EXPR
> VAR = itr.__enter__()
> try: BLOCK
> finally: itr.__exit__(*sys.exc_info()) # Except sys.exc_info() isn't defined by finally

Yeah, that's what I wanted to do too.  That should be about what my
second suggestion did.  Slightly updated, it looks like:

   stmt = EXPR1
   VAR1 = stmt.__enter__()
   exc = () # or (None, None, None) if you prefer
   try:
       try:
           BLOCK1
       except:
           exc = sys.exc_info()
   finally:
       if stmt.__exit__(*exc) is not None:
           raise exc[0], exc[1], exc[2]

The only difference should be that with the above semantics if you
return a (non-None) value from __exit__, the exception will be
suppressed (that is, it will not be reraised).  Means that if you want
to suppress an exception, you have to add a return statement (but if
you want exceptions to be reraised, you don't have to do anything.)

I suggest this only because there were a few suggested use-cases for
suppressing exceptions.  OTOH, almost all my uses are just
try/finally's so I'm certainly not going to cry if that last finally
instead looks like:

   finally:
       stmt.__exit__(*exc)
       raise exc[0], exc[1], exc[2]

=)

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From eric.nieuwland at xs4all.nl  Thu May 12 07:12:50 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Thu, 12 May 2005 07:12:50 +0200
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505112028b263df6@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com>
Message-ID: <285b213d14928985b63f7a8d05b808b6@xs4all.nl>

Guido van Rossum wrote:
> class locking:
>     def __init__(self, lock): self.lock = lock
>     def __enter__(self): self.lock.acquire()
>     def __exit__(self, *args): self.lock.release()
>
> class opening:
>     def __init__(self, filename): self.filename = filename
>     def __enter__(self): self.f = open(self.filename); return self.f
>     def __exit__(self, *args): self.f.close()
>
> And do EXPR as VAR: BLOCK would mentally be translated into
>
> itr = EXPR
> VAR = itr.__enter__()
> try: BLOCK
> finally: itr.__exit__(*sys.exc_info()) # Except sys.exc_info() isn't defined by finally

In this example locking's __enter__ does not return anything.
Would
do EXPR:
	BLOCK
also be legal syntax?


-eric


From mwh at python.net  Thu May 12 09:29:18 2005
From: mwh at python.net (Michael Hudson)
Date: Thu, 12 May 2005 08:29:18 +0100
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <ca471dc2050511143045371d@mail.gmail.com> (Guido van Rossum's
	message of "Wed, 11 May 2005 14:30:27 -0700")
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
	<972ec5bd05051016285954fec@mail.gmail.com>
	<rowen-718BBF.12430811052005@sea.gmane.org>
	<d11dcfba05051113053cc67f91@mail.gmail.com>
	<ca471dc2050511143045371d@mail.gmail.com>
Message-ID: <2mbr7g7vht.fsf@starship.python.net>

Guido van Rossum <gvanrossum at gmail.com> writes:

> [Steven Bethard]
>> I have a feeling that it might actually be easier to continue to
>> document try/except and try/finally separately and then just give the
>> semantics of try/except/finally in terms of the other semantics.  Take
>> a look at the Java Language Specification[1] (pages 399-401) if you
>> want to see how nasty documenting try/except/finally can get.  And
>> they don't even have an else clause! ;-)
>
> Fine with me.
>
> Can I go ahead and approve this now 

While I see the cost of this PEP being pretty small, I see the benefit
the same way too.

> before someone proposes to add a new keyword?

Heh.

Cheers,
mwh

-- 
  If i don't understand lisp, it would be wise to not bray about
  how lisp is stupid or otherwise criticize, because my stupidity
  would be archived and open for all in the know to see.
                                                -- Xah, comp.lang.lisp

From p.f.moore at gmail.com  Thu May 12 11:01:07 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 12 May 2005 10:01:07 +0100
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505111411e63939f@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827363.7070105@gmail.com>
	<ca471dc20505111411e63939f@mail.gmail.com>
Message-ID: <79990c6b050512020158646446@mail.gmail.com>

On 5/11/05, Guido van Rossum <gvanrossum at gmail.com> wrote:
> I realize that the pushback was against looping, but whereas in the
> PEP 340 proposal general exception handling comes out naturally, it
> feels as an ugly wart in the modified PEP 310 proposal.
> 
> Plus I think the use cases are much weaker (even PEP 340 doesn't have
> many use cases for exception handling -- the only one is the
> auto-retry example).

Accepted, but I still feel that the templates should explicitly
include the try...finally, rather than simply having the yield mark
the split. The examples seem more readable that way. Explicit is
better than implicit, and all that...

Paul.

From python-dev at zesty.ca  Thu May 12 11:04:14 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 12 May 2005 04:04:14 -0500 (CDT)
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505112028b263df6@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505120354430.14555@server1.LFW.org>

On Wed, 11 May 2005, Guido van Rossum wrote:
> [Steven Bethard]
> >     exc = ()
> >     try:
> >         try:
> >             BLOCK1
> >         except:
> >             exc = sys.exc_info()
> >     finally:
> >         stmt_exit(*exc)
> >
> > would this make any of the examples impossible to write?  All you have
> > to do to suppress an exception is to not reraise it in __exit__.
>
> But this use case would contain a trap for the unwary user who is
> writing an __exit__ method -- they have to remember to re-raise an
> exception if it was passed in, but that's easy to forget (and slightly
> tricky since you have to check the arg count or whether the first
> argument is not None).

Then wouldn't it be simplest to separate normal exit from exceptional
exit?  That is, split __exit__ into __except__ and __finally__.  If
__except__ is defined, then it handles the exception, otherwise the
exception is raised normally.

> class locking:
>     def __init__(self, lock): self.lock = lock
>     def __enter__(self): self.lock.acquire()
>     def __exit__(self, *args): self.lock.release()

Having __exit__ take varargs is a signal to me that it mashes together
what really are two different methods.


-- ?!ng

From ncoghlan at gmail.com  Thu May 12 12:31:44 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 12 May 2005 20:31:44 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <d11dcfba05051114441404ef0a@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>	<4281E14C.1070405@gmail.com>	<79990c6b05051105361e7a9ba0@mail.gmail.com>	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
Message-ID: <42833090.3060905@gmail.com>

Steven Bethard wrote:
> On 5/11/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> 
>>The gist is that the alternative is to require an __exit__() method to raise
>>TerminateBlock in order to suppress an exception.
> 
> So I didn't see any examples that really needed TerminateBlock to
> suppress an exception.

Yeah, I figured out a tidier way to handle it after reading Phillip's message 
earlier today. My idea is similar to your second solution, but with an early 
exit via break, continue or return still indicated to the __exit__() method via 
TerminateBlock so that examples like transaction() continue to do the right thing:

     the_stmt = EXPR1
     stmt_enter = getattr(the_stmt, "__enter__", None)
     stmt_exit = getattr(the_stmt, "__exit__", None)
     if stmt_enter is None or stmt_exit is None:
         raise TypeError("User defined statement template required")

     terminate = True
     VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
     try:
         try:
             BLOCK1
         except TerminateBlock:
             raise # Disallow suppression of TerminateBlock
         except:
             terminate = False
             if not stmt_exit(*sys.exc_info()):
                 raise
         else:
             terminate = False
             stmt_exit()
     finally:
         if terminate:
             try:
                 stmt_exit(TerminateBlock, None, None)
             except TerminateBlock:
                 pass

Version 1.5 uses these updated semantics, and the suggested generator __exit__() 
method semantics are adjusted appropriately. I've also added a paragraph in Open 
Issues about removing the ability to suppress exceptions as Guido has suggested. 
However, I'm hoping his objections are based on the assorted horrible mechanisms 
I used in versions before this one - he is quite right that forcing every 
__exit__() method to reraise exceptions was a rather ugly wart.

The new version also fixes a typo in the auto_retry example that a couple of 
people pointed out, and adds a non-exception related example from Arnold deVos.

The URL is the same as before:
http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From michele.simionato at gmail.com  Thu May 12 13:23:11 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Thu, 12 May 2005 07:23:11 -0400
Subject: [Python-Dev] a patch to inspect and a non-feature request
Message-ID: <4edc17eb05051204238677a29@mail.gmail.com>

Well, I logged into SourceForge with the idea of filing my feature request
about copying functions, and then my eye went to my previous submissions.
It seems it takes some time to fix non-critical bugs, doesn't it? ;)

Two years ago, I discovered a bug with pydoc for classes containing "super" 
objects:

>>> class C(object):
...    pass

>>> C.s = super(C)

>>> help(C) # aargh!!

I filed that bug 25 months ago and it is still there (actually Brett
Cannon fixed it but then somebody else broke his patch).
 
Clearly nobody uses this feature and fixing the bug is not at
all urgent, but it still disturbs me, so I have worked out
a patch. Actually, the problem is not in pydoc but in inspect,
which treats super objects as methods, whereas they should be treated
as data. Here is the patch:

$ diff -c /home/micheles/python/dist/src/Lib/inspect.py inspect.py
*** /home/micheles/python/dist/src/Lib/inspect.py       Thu May 12 13:05:10 2005
--- inspect.py  Thu May 12 13:06:55 2005
***************
*** 77,83 ****
              and not hasattr(object, "__set__") # else it's a data descriptor
              and not ismethod(object)           # mutual exclusion
              and not isfunction(object)
!             and not isclass(object))

  def isdatadescriptor(object):
      """Return true if the object is a data descriptor.
--- 77,84 ----
              and not hasattr(object, "__set__") # else it's a data descriptor
              and not ismethod(object)           # mutual exclusion
              and not isfunction(object)
!             and not isclass(object)
!             and not isinstance(object, super))

  def isdatadescriptor(object):
      """Return true if the object is a data descriptor.

It changes the code of ismethoddescriptor to make sure that super objects
are not treated as methods.

BTW, I have downloaded the CVS version of Python and run test_inspect
against the patch and it is working. However, introspection tools have
the tendency to be very fragile (especially with the rate of changes
in Python) and it is possible that this fix would break something else.

Let The Powers That Be decide.

The test suite should be augmented with a test such as:

>>> inspect.ismethoddescriptor(C.s)
False
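
Here is that check as a self-contained sketch, with the patched predicate
inlined (a simplified model of inspect's ismethoddescriptor, not the real
code):

```python
import inspect

def ismethoddescriptor(obj):
    # Simplified version of inspect.ismethoddescriptor with the
    # proposed fix appended: explicitly exclude super objects.
    return (hasattr(obj, "__get__")
            and not hasattr(obj, "__set__")   # else it's a data descriptor
            and not inspect.ismethod(obj)
            and not inspect.isfunction(obj)
            and not inspect.isclass(obj)
            and not isinstance(obj, super))   # the patch

class C:
    pass

C.s = super(C)

print(ismethoddescriptor(C.s))  # False with the patch applied
```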

In my experience super is a huge can of worms, and actually I have a
non-feature request about the descriptor aspect of super: I would like
super's __get__ method and the possibility to call super with just one
argument to be removed in Python 3000. They are pretty much useless
(yes, I know of "autosuper") and error prone.

               Michele Simionato

From gvanrossum at gmail.com  Thu May 12 15:50:15 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 06:50:15 -0700
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
In-Reply-To: <2mbr7g7vht.fsf@starship.python.net>
References: <d59vll$4qf$1@sea.gmane.org>
	<ca471dc205050410097355aa80@mail.gmail.com>
	<1f7befae05050411416c198c54@mail.gmail.com>
	<d5b7oj$uav$1@sea.gmane.org>
	<972ec5bd05051016285954fec@mail.gmail.com>
	<rowen-718BBF.12430811052005@sea.gmane.org>
	<d11dcfba05051113053cc67f91@mail.gmail.com>
	<ca471dc2050511143045371d@mail.gmail.com>
	<2mbr7g7vht.fsf@starship.python.net>
Message-ID: <ca471dc205051206505ff881ba@mail.gmail.com>

> > Can I go ahead and approve this now
> 
> While I see the cost of this PEP being pretty small, I see the benefit
> the same way too.

Sure. Let me approve it and we'll see if someone cares enough to implement it.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From steven.bethard at gmail.com  Thu May 12 15:52:10 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Thu, 12 May 2005 07:52:10 -0600
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <42833090.3060905@gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
Message-ID: <d11dcfba050512065272ce7a14@mail.gmail.com>

On 5/12/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Yeah, I figured out a tidier way to handle it after reading Phillip's message
> earlier today. My idea is similar to your second solution, but with an early
> exit via break, continue or return still indicated to the __exit__() method via
> TerminateBlock so that examples like transaction() continue to do the right thing:

Do they?  I don't write enough transactional code, but I would have
thought that break, continue or return would have been a normal,
expected exit from the do-statement and therefore should do a
db.commit(), not a db.rollback().  Do you think you could add an
example of how the transaction do-statement would be used in such a
way that these would be the desired semantics?

Thanks,

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From steven.bethard at gmail.com  Thu May 12 16:08:20 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Thu, 12 May 2005 08:08:20 -0600
Subject: [Python-Dev] a patch to inspect and a non-feature request
In-Reply-To: <4edc17eb05051204238677a29@mail.gmail.com>
References: <4edc17eb05051204238677a29@mail.gmail.com>
Message-ID: <d11dcfba05051207084b3e9144@mail.gmail.com>

On 5/12/05, Michele Simionato <michele.simionato at gmail.com> wrote:
> In my experience super is a huge can of worms and actually I have a non-feature
> request about the descriptor aspect of super: I would like super's
> __get__ method
> and the possibility to call super with just one argument to be removed
> in Python 3000.

+1, at least as long as super doesn't work with "meta-attributes" and classmethods:

py> class B(object):
...     "The B type"
...     @classmethod
...     def m(cls):
...         print "B.m"
... 
py> class C(B):
...     @classmethod
...     def m(cls):
...         print "C.m"
...         cls._sup.m()
... 
py> C._sup = super(C)
py> super(C, C).__doc__
'The B type'
py> super(C, C).__name__
Traceback (most recent call last):
  File "<interactive input>", line 1, in ?
AttributeError: 'super' object has no attribute '__name__'
py> C().m()
C.m
Traceback (most recent call last):
  File "<interactive input>", line 1, in ?
  File "<interactive input>", line 5, in m
AttributeError: 'super' object has no attribute 'm'

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From jimjjewett at gmail.com  Thu May 12 16:17:00 2005
From: jimjjewett at gmail.com (Jim Jewett)
Date: Thu, 12 May 2005 10:17:00 -0400
Subject: [Python-Dev]  Merging PEP 310 and PEP 340-redux?
Message-ID: <fb6fbf5605051207173ae13b0f@mail.gmail.com>

In  http://mail.python.org/pipermail/python-dev/2005-May/053652.html Nick wrote:

     terminate = True
     VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
     try:
         try:
             BLOCK1
         except TerminateBlock:
             raise # Disallow suppression of TerminateBlock
         except:
             terminate = False
             if not stmt_exit(*sys.exc_info()):
                 raise
         else:
             terminate = False
             stmt_exit()
     finally:
         if terminate:
             try:
                 stmt_exit(TerminateBlock, None, None)
             except TerminateBlock:
                 pass

This seems confusing to me, as if it were saying "OK, I don't want to 
finalize this, so I'll set terminate to false, but then finalize anyhow."

Would "terminated=False" (and getting set later to True) still meet
your requirements?  Or even "finalized=False"?  I realize that the
spelling of
finalise is a bugaboo, but presumably this is really a hidden variable,
instead of something that the user must type.  Or have I misunderstood
that as well?

-jJ

From gvanrossum at gmail.com  Thu May 12 16:50:29 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 07:50:29 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <d11dcfba050512065272ce7a14@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
Message-ID: <ca471dc205051207504facf994@mail.gmail.com>

A quick response to various small issues...

- Benji York proposes that file and lock objects (for example) could
  have suitable __enter__ and __exit__ methods (__enter__ would have
  to return self).  Right!

- Greg Ewing (I believe) wants 'do' instead of 'with' for the
  keyword.  I think I like 'with' better, especially combining it with
  Benji's proposal.  IMO this reads better with 'with' than with 'do':

    with open("/etc/passwd") as f:
        for line in f:
            ...

- Steve Bethard has this example:

    stmt = EXPR1
    VAR1 = stmt.__enter__()
    exc = () # or (None, None, None) if you prefer
    try:
	try:
	    BLOCK1
	except:
	    exc = sys.exc_info()
    finally:
	if stmt.__exit__(*exc) is not None:
	    raise exc[0], exc[1], exc[2]

  but he seems to forget that finally *always* re-raises the
  exception.  Anyway, I still don't care for the use case; rather than
  fixing the coding bug, your time would be better spent arguing why
  this functionality can't be missed.

- Eric Nieuwland asks if the VAR is still optional.  Yes, it is (this
  is implicit in the entire set of threads).

- Paul Moore wants the generator templates to explicitly contain
  try/finally (or try/except, depending on the use case).  That's much
  more work though (passing exceptions into a generator is a new
  feature) and is not necessary to get the "redux" version.

- Ka-ping Yee thinks we need separate entry points for the exceptional
  and the normal termination case.  I disagree; this would end up in
  unnecessary duplication of code (or boilerplate to equate the two
  methods) in most cases.  The whole *point* is that finally gets to
  do its clean-up act regardless of whether an exception is being
  processed or not.  The varargs signature to __exit__ was just me
  being lazy instead of typing

    def __exit__(self, t=None, v=None, tb=None): ...

- Nick is still peddling his much more complicated variant.  I
  recommend that he focuses on arguing use cases rather than semantic
  subtleties, or else it won't get any traction (at least not with me
  :-).

--Guido van Rossum (home page: http://www.python.org/~guido/)

From michele.simionato at gmail.com  Thu May 12 16:58:57 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Thu, 12 May 2005 10:58:57 -0400
Subject: [Python-Dev] a patch to inspect and a non-feature request
In-Reply-To: <d11dcfba05051207084b3e9144@mail.gmail.com>
References: <4edc17eb05051204238677a29@mail.gmail.com>
	<d11dcfba05051207084b3e9144@mail.gmail.com>
Message-ID: <4edc17eb05051207584ffb937@mail.gmail.com>

On 5/12/05, Steven Bethard <steven.bethard at gmail.com> wrote:
> super doesn't work with "meta-attributes" and classmethods:
> 
> py> super(C, C).__name__
> Traceback (most recent call last):
>   File "<interactive input>", line 1, in ?
> AttributeError: 'super' object has no attribute '__name__'

Actually this is the Right Thing to do for super. It is something
to be aware of, not something to change. Since __name__ is
a descriptor defined in the type metaclass and not an attribute
defined in the base class, super correctly does not retrieve it.
It is enough to add some documentation about "super" caveats
and nonobvious points.
What I really dislike is super called with only one argument since
it has many unpleasant surprises and no real advantages :-(
               
Michele Simionato

From benji at zope.com  Thu May 12 06:01:22 2005
From: benji at zope.com (Benji York)
Date: Thu, 12 May 2005 00:01:22 -0400
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505112028b263df6@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>	<4281E14C.1070405@gmail.com>	<79990c6b05051105361e7a9ba0@mail.gmail.com>	<42827154.4040006@gmail.com>	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com>
Message-ID: <4282D512.7060602@zope.com>

I think this is my first post to python-dev, so a mini-intro: I've 
been lurking here for about 5 years, been a "professional" Python user a 
bit longer, and am now working at Zope Corp.

Guido van Rossum wrote:
> Going for all-out simplicity, I would like to be able to write these examples:
> 
> class locking:
>     def __init__(self, lock): self.lock = lock
>     def __enter__(self): self.lock.acquire()
>     def __exit__(self, *args): self.lock.release()
> 
> class opening:
>     def __init__(self, filename): self.filename = filename
>     def __enter__(self): self.f = open(self.filename); return self.f
>     def __exit__(self, *args): self.f.close()

I've read the entire discussion, but may have missed this point, so 
correct me if I'm wrong. Wouldn't these semantics let "normal" objects 
be used in a do-statement?  For example, if the file object had these methods:

def __enter__(self): return self
def __exit__(self, *args): self.close()

you could write

do file('whatever') as f:
     lines = f.readlines()

Or a lock:

def __enter__(self): self.acquire(); return self
def __exit__(self, *args): self.release()

do my_lock:
     a()
     b()
     c()
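To make the proposal concrete, here is a hand-expansion sketch of what "do EXPR as VAR: BLOCK" might compile to, assuming only the __enter__/__exit__ protocol from Guido's examples (the expansion itself is my reading of the thread, not settled semantics):

```python
# Sketch: expanding "do locking(my_lock): BLOCK" by hand using only the
# __enter__/__exit__ protocol under discussion.
import threading

class locking:
    def __init__(self, lock):
        self.lock = lock
    def __enter__(self):
        self.lock.acquire()
    def __exit__(self, *args):
        self.lock.release()

my_lock = threading.Lock()

# The do-statement body would run between __enter__ and a guaranteed __exit__:
mgr = locking(my_lock)
mgr.__enter__()
try:
    pass  # the block body: a(); b(); c()
finally:
    mgr.__exit__()
```

The point is that any ordinary object growing these two methods becomes usable with the statement, with no generator machinery involved.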
-- 
Benji York
Sr. Software Engineer
Zope Corporation

From steven.bethard at gmail.com  Thu May 12 17:30:54 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Thu, 12 May 2005 09:30:54 -0600
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc205051207504facf994@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc205051207504facf994@mail.gmail.com>
Message-ID: <d11dcfba05051208306427d8f7@mail.gmail.com>

On 5/12/05, Guido van Rossum <gvanrossum at gmail.com> wrote:
> - Steve Bethard has this example:
> 
>     stmt = EXPR1
>     VAR1 = stmt.__enter__()
>     exc = () # or (None, None, None) if you prefer
>     try:
>         try:
>             BLOCK1
>         except:
>             exc = sys.exc_info()
>     finally:
>         if stmt.__exit__(*exc) is not None:
>             raise exc[0], exc[1], exc[2]
> 
>   but he seems to forget that finally *always* re-raises the
>   exception.

Not if except catches it:

py> try:
...     try:
...         raise Exception
...     except:
...         exc = sys.exc_info()
... finally:
...     pass
...    
py>

As I understand it, finally only re-raises the exception if it wasn't
already caught.  And I have to use an except block above to catch it
so that sys.exc_info() returns something other than (None, None,
None).

>   Anyway, I still don't care for the use case; rather than
>   fixing the coding bug, your time would be better spent arguing why
>   this functionality can't be missed.

Sorry, I'm unclear.  Do you not want sys.exc_info() passed to
__exit__, or do you not want the with/do-statement to be allowed to
suppress exceptions?  Or both?

My feeling is that the mechanism for suppressing exceptions is
confusing, and it would probably be better to always reraise the
exception. (I included it mainly to provide a middle ground between
Guido's and Nick's proposals.)  That is, I prefer something like:

   stmt = EXPR1
   VAR1 = stmt.__enter__()
   exc = () # or (None, None, None) if you prefer
   try:
       try:
           BLOCK1
       except:
           exc = sys.exc_info()
   finally:
       stmt.__exit__(*exc)
       if exc:
           raise exc[0], exc[1], exc[2]

which should really read the same as Guido's suggestion:

   stmt = EXPR1
   VAR1 = stmt.__enter__()
   try:
       BLOCK1
   finally:
       stmt.__exit__(*sys.exc_info())

except that since sys.exc_info() returns (None, None, None) when there
wasn't an except block, this won't actually work.

If we don't provide some flag that an exception occurred, the
transaction example doesn't work.  My feeling is that if we're going
to provide any flag, we might as well provide the entire
sys.exc_info().
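As a concrete illustration of why the flag matters, here is a hedged sketch of the transaction example: __exit__ inspects the exception info it receives to choose between commit and rollback. FakeDB and its method names are placeholders I made up, not any real database API:

```python
# Sketch: a transactional() wrapper whose __exit__ uses the passed-in
# exception info to decide commit vs. rollback. FakeDB is a stand-in.
import sys

class transactional:
    def __init__(self, db):
        self.db = db
    def __enter__(self):
        return self.db
    def __exit__(self, *exc):
        # If exception info was passed in, roll back; otherwise commit.
        if exc and exc[0] is not None:
            self.db.rollback()
        else:
            self.db.commit()

class FakeDB:
    def __init__(self):
        self.state = 'open'
    def commit(self):
        self.state = 'committed'
    def rollback(self):
        self.state = 'rolled back'

# Successful body: __exit__ sees no exception and commits.
ok_db = FakeDB()
mgr = transactional(ok_db)
mgr.__enter__()
mgr.__exit__(None, None, None)

# Failing body: __exit__ sees the exc_info and rolls back.
bad_db = FakeDB()
mgr = transactional(bad_db)
mgr.__enter__()
try:
    raise ValueError('operation failed')
except ValueError:
    mgr.__exit__(*sys.exc_info())
```

Without some indication that an exception occurred, __exit__ here could not tell the two cases apart.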

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From michele.simionato at gmail.com  Thu May 12 17:34:57 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Thu, 12 May 2005 11:34:57 -0400
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <4282D512.7060602@zope.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
Message-ID: <4edc17eb05051208341b6bf751@mail.gmail.com>

On 5/12/05, Benji York <benji at zope.com> wrote:
> if the file object had these methods:
> 
> def __enter__(self): return self
> def __exit__(self, *args): self.close()
> 
> you could write
> 
> do file('whatever') as f:
>      lines = f.readlines()
> 
> Or a lock:
> 
> def __enter__(self): self.acquire(); return self
> def __exit__(self, *args): self.release()
> 
> do my_lock:
>      a()
>      b()
>      c()

Ah, finally a proposal that I can understand! 
But maybe the keyword should be "let":

let lock:
   do_something

let open("myfile") as f:
    for line in f: do_something(line)

or even, without need of "as":

let f = file("myfile"):
    for line in f: do_something(line)

which I actually like more

             Michele Simionato

From pje at telecommunity.com  Thu May 12 17:56:55 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Thu, 12 May 2005 11:56:55 -0400
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc205051207504facf994@mail.gmail.com>
References: <d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050512115516.01f66db8@mail.telecommunity.com>

At 07:50 AM 5/12/2005 -0700, Guido van Rossum wrote:
>- Paul Moore wants the generator templates to explicitly contain
>   try/finally (or try/except, depending on the use case).  That's much
>   more work though (passing exceptions into a generator is a new
>   feature) and is not necessary to get the "redux" version.

Uh oh.  Speaking on behalf of the "coroutine-y people" :), does this mean 
that we're not going to get the ability to pass exceptions into generators?


From gvanrossum at gmail.com  Thu May 12 18:21:04 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 09:21:04 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <5.1.1.6.0.20050512115516.01f66db8@mail.telecommunity.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc205051207504facf994@mail.gmail.com>
	<5.1.1.6.0.20050512115516.01f66db8@mail.telecommunity.com>
Message-ID: <ca471dc205051209211a57ccbd@mail.gmail.com>

[Guido van Rossum]
> >- Paul Moore wants the generator templates to explicitly contain
> >   try/finally (or try/except, depending on the use case).  That's much
> >   more work though (passing exceptions into a generator is a new
> >   feature) and is not necessary to get the "redux" version.

[Phillip J. Eby]
> Uh oh.  Speaking on behalf of the "coroutine-y people" :), does this mean
> that we're not going to get the ability to pass exceptions into generators?

That would be a separate proposal (PEP 288 or PEP 325).

The do/with statement with its __enter__ and __exit__ APIs is entirely
independent from generators. Having shown a few non-generator do/with
wrappers I'm not so sure there's a lot of need for generators here;
especially since generators will always have that
round-peg-square-hole feeling when they're used in a non-looping
context. But I still want to leave the door open for using generators
as do/with-templates, hence my modification of PEP 310 (otherwise we
could just accept PEP 310 as is and be done with it).

For coroutines, see PEP 342.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From python-dev at zesty.ca  Thu May 12 21:02:29 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 12 May 2005 14:02:29 -0500 (CDT)
Subject: [Python-Dev] "with" use case: exception chaining
In-Reply-To: <ca471dc205051207504facf994@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc205051207504facf994@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505121229590.14555@server1.LFW.org>

> - Ka-ping Yee thinks we need separate entry points for the exceptional
>   and the normal termination case.  I disagree; this would end up in
>   unnecessary duplication of code (or boilerplate to equate the two
>   methods) in most cases.  The whole *point* is that finally gets to
>   do its clean-up act regardless of whether an exception is being
>   processed or not.

Okay, let me back up for a second.  My suggestion responded to your
reply to Steve Bethard's example about exception handling.  The point
of the suggestion is that *if* we are going to let "with" do exception
handling, it should be done in a separate method.  I didn't mean to
imply that __finally__ should be skipped.

This brings us back to the question of whether "with" should be able
to handle exceptions.  On this, you wrote:

> For try/finally we have a large body of use cases that just scream for
> abstraction. I'm not convinced that we have the same for try/except.

So let's look at some use cases.  I've thought of two; the first one is
nice and simple, and the second one is messier so i'll discuss it in a
separate message.

Example 1: Exception Chaining.

As has been previously discussed, the information from an exception can
be lost when the handling of the exception runs into a problem.  It is
often helpful to preserve the original reason for the problem.

Suppose, by convention, that the "reason" attribute on exception objects
is designated for this purpose.  The assignment of this attribute can be
conveniently abstracted using a "with" clause as follows:

    try:
        # ... risky operation ...
    except:
        with reason(sys.exc_info()):
            # ... cleanup ...

The "with reason" construct would be implemented like this:

    class reason:
        def __init__(self, etype, evalue, etb):
            self.reason = etype, evalue, etb

        def __except__(self, etype, evalue, etb):
            evalue.reason = self.reason
            raise etype, evalue, etb

(Other possible names for "reason" might be "cause" or "context".)
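For the record, here is a runnable approximation of the "with reason" idea in modern syntax (the original sketch uses Python 2 raise forms); the 'reason' attribute is the convention proposed above, not a standard one, and the __enter__/__exit__ shape is my adaptation:

```python
# Sketch: chain the original exception onto any exception raised during
# cleanup, via a 'reason' attribute (name per the convention above).
class reason:
    def __init__(self, exc):
        self._original = exc
    def __enter__(self):
        return self
    def __exit__(self, etype, evalue, etb):
        if evalue is not None:
            evalue.reason = self._original
        return False  # never suppress; let the new exception propagate

caught = None
try:
    try:
        raise ValueError('original problem')
    except ValueError as original:
        with reason(original):
            raise RuntimeError('cleanup failed')
except RuntimeError as exc:
    caught = exc
# caught.reason now carries the ValueError that started it all.
```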


-- ?!ng

From python-dev at zesty.ca  Thu May 12 22:00:24 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 12 May 2005 15:00:24 -0500 (CDT)
Subject: [Python-Dev] "with" use case: replacing a file
In-Reply-To: <ca471dc205051207504facf994@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc205051207504facf994@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505121404100.14555@server1.LFW.org>

Here's another use case to think about.

Example 2: Replacing a File.

Suppose we want to reliably replace a file.  We require that either:

    (a) The file is completely replaced with the new contents;
 or (b) the filesystem is unchanged and a meaningful exception is thrown.

We'd like to be able to write this conveniently as:

    with replace(filename) as file:
        ...
        file.write(spam)
        ...
        file.write(eggs)
        ...

To make sure the file is never only partially written, we rely on the
filesystem to rename files atomically, so the basic steps (without
error handling) look like this:

    tempname = filename + '.tmp'
    file = open(tempname, 'w')
    ...
    file.write(spam)
    ...
    file.close()
    os.rename(tempname, filename)

We would like to make sure the temporary file is cleaned up and no
filehandles are left open.  Here's my analysis of all the execution
paths we need to cover:

    1. +open +write +close +rename
    2. +open +write +close -rename ?remove
    3. +open +write -close ?remove
    4. +open -write +close ?remove
    5. +open -write -close ?remove
    6. -open

(In this list, + means success, - means failure, ? means don't care.)

When i add error handling, i get this:

    tempname = filename + '.tmp'
    file = open(tempname, 'w') # okay to let exceptions escape
    problem = None
    try:
        try:
            ...
            file.write(spam)
            ...
        except:
            problem = sys.exc_info()
            raise
    finally:
        try:
            file.close()
        except Exception, exc:
            problem, problem.reason = exc, problem
        if not problem:
            try:
                os.rename(tempname, filename)
            except Exception, exc:
                problem, problem.reason = exc, problem
        if problem:
            try:
                os.remove(tempname)
            except Exception, exc:
                problem, problem.reason = exc, problem
                raise problem

In this case, the implementation of replace() doesn't require a
separate __except__ method:

class replace:
    def __init__(self, filename):
        self.filename = filename
        self.tempname = '%s.%d.%d' % (self.filename, os.getpid(), id(self))

    def __enter__(self):
        self.file = open(self.tempname, 'w')
        return self

    def write(self, data):
        self.file.write(data)

    def __exit__(self, *problem):
        try:
            self.file.close()
        except Exception, exc:
            problem, problem.reason = exc, problem
        if not problem: # commit
            try:
                os.rename(self.tempname, self.filename)
            except Exception, exc:
                problem, problem.reason = exc, problem
        if problem: # rollback
            try:
                os.remove(self.tempname)
            except Exception, exc:
                problem, problem.reason = exc, problem
                raise problem

This is all so intricate i'm not sure if i got it right.  Somebody
let me know if this looks right or not.  (In any case, i look forward
to the day when i can rely on someone else to get it right, and they
only have to write it once!)
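As a sanity check on the underlying pattern (write to a temp file, then rename over the target), here is a minimal runnable version in modern syntax; os.replace is used for cross-platform atomic rename, which is an assumption beyond the original os.rename-based sketch:

```python
# Sketch: the write-then-atomic-rename core of the replace() example,
# with temp-file cleanup on any failure.
import os
import tempfile

def replace_file(filename, data):
    tempname = filename + '.tmp'
    f = open(tempname, 'w')
    try:
        f.write(data)
        f.close()
        os.replace(tempname, filename)  # atomic rename over the target
    except BaseException:
        f.close()
        if os.path.exists(tempname):
            os.remove(tempname)  # roll back: leave filesystem unchanged
        raise

workdir = tempfile.mkdtemp()
target = os.path.join(workdir, 'out.txt')
replace_file(target, 'spam')
replace_file(target, 'eggs')  # second replacement overwrites the first
content = open(target).read()
```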


-- ?!ng

From pje at telecommunity.com  Thu May 12 22:34:06 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Thu, 12 May 2005 16:34:06 -0400
Subject: [Python-Dev] "with" use case: replacing a file
In-Reply-To: <Pine.LNX.4.58.0505121404100.14555@server1.LFW.org>
References: <ca471dc205051207504facf994@mail.gmail.com>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc205051207504facf994@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050512162602.01f7c420@mail.telecommunity.com>

At 03:00 PM 5/12/2005 -0500, Ka-Ping Yee wrote:
>This is all so intricate i'm not sure if i got it right.  Somebody
>let me know if this looks right or not.  (In any case, i look forward
>to the day when i can rely on someone else to get it right, and they
>only have to write it once!)

It looks fine, but it's not a use case for suppressing exceptions, nor was 
the exception-chaining example.

Really, the only use case for suppressing exceptions is to, well, suppress 
exceptions that are being logged, shown to the user, sent via email, or 
just plain ignored.  Guido's point, I think, is that these use cases are 
rare enough (yet simple and similar enough) that they don't deserve support 
from the cleanup facility, and instead should use a try/except block.

After reviewing the cases in my own code where I might've used a 'with 
logged_exceptions()' or similar blocks, I think I now agree.  The 
difference between:

     try:
         BLOCK
     except:
         logger.exception(...)

and:

     with log_errors(logger):
         BLOCK

doesn't seem worth the effort, especially since this pattern just doesn't 
occur that often, compared to resource-using blocks.

What *your* use cases seem to illustrate, however, is that it's quite 
possible that an __exit__ might well need to contain complex error handling 
of its own, including the need to throw a different exception than the one 
that was passed in.


From gvanrossum at gmail.com  Thu May 12 23:16:17 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 14:16:17 -0700
Subject: [Python-Dev] "with" use case: exception chaining
In-Reply-To: <Pine.LNX.4.58.0505121229590.14555@server1.LFW.org>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc205051207504facf994@mail.gmail.com>
	<Pine.LNX.4.58.0505121229590.14555@server1.LFW.org>
Message-ID: <ca471dc2050512141639663ea0@mail.gmail.com>

[Ka-Ping Yee]
> Example 1: Exception Chaining.
> 
> As has been previously discussed, the information from an exception can
> be lost when the handling of the exception runs into a problem.  It is
> often helpful to preserve the original reason for the problem.
[example deleted]

This problem is universal -- every except clause (in theory) can have
this problem. I'd much rather deal with this in a systematic way in
the Python VM's exception handling machinery. Modifying every
potentially affected except clause to use some additional boilerplate
doesn't strike me as a good solution.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From python-dev at zesty.ca  Thu May 12 23:59:14 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 12 May 2005 16:59:14 -0500 (CDT)
Subject: [Python-Dev] "with" use case: exception chaining
In-Reply-To: <ca471dc2050512141639663ea0@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc205051207504facf994@mail.gmail.com>
	<Pine.LNX.4.58.0505121229590.14555@server1.LFW.org>
	<ca471dc2050512141639663ea0@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505121629360.14555@server1.LFW.org>

On Thu, 12 May 2005, Guido van Rossum wrote:
> [Ka-Ping Yee]
> > Example 1: Exception Chaining.
> >
> > As has been previously discussed, the information from an exception can
> > be lost when the handling of the exception runs into a problem.  It is
> > often helpful to preserve the original reason for the problem.
> [example deleted]
>
> This problem is universal -- every except clause (in theory) can have
> this problem. I'd much rather deal with this in a systematic way in
> the Python VM's exception handling machinery.

That's reasonable.  Unless another use case comes up, i withdraw my
suggestion for a separate __except__ method.

I hope the examples were interesting, anyhow.


-- ?!ng

From ncoghlan at gmail.com  Fri May 13 00:12:00 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 13 May 2005 08:12:00 +1000
Subject: [Python-Dev] "with" use case: replacing a file
In-Reply-To: <5.1.1.6.0.20050512162602.01f7c420@mail.telecommunity.com>
References: <ca471dc205051207504facf994@mail.gmail.com>	<ca471dc2050509215823876c50@mail.gmail.com>	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>	<4281E14C.1070405@gmail.com>	<79990c6b05051105361e7a9ba0@mail.gmail.com>	<42827154.4040006@gmail.com>	<d11dcfba05051114441404ef0a@mail.gmail.com>	<42833090.3060905@gmail.com>	<d11dcfba050512065272ce7a14@mail.gmail.com>	<ca471dc205051207504facf994@mail.gmail.com>
	<5.1.1.6.0.20050512162602.01f7c420@mail.telecommunity.com>
Message-ID: <4283D4B0.10306@gmail.com>

Phillip J. Eby wrote:
> Really, the only use case for suppressing exceptions is to, well, suppress 
> exceptions that are being logged, shown to the user, sent via email, or 
> just plain ignored.  Guido's point, I think, is that these use cases are 
> rare enough (yet simple and similar enough) that they don't deserve support 
> from the cleanup facility, and instead should use a try/except block.

Particularly since the *action* can be factored out into a do statement - it's 
only the *suppression* that has to be reproduced inline. That is:

   try:
       do standard_reaction():
           pass
   except MyException:
       pass

> After reviewing the cases in my own code where I might've used a 'with 
> logged_exceptions()' or similar blocks, I think I now agree.

I think I'm convinced, too. The only actual use case in the PEP is auto_retry, 
and that can be more obviously written with a try/except block inside the loop:

   for retry in reversed(xrange(num_attempts)):
       try:
          make_attempt()
          break
       except IOError:
          if not retry:
              raise

Not as pretty perhaps, but the control flow is far easier to see.
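For completeness, the loop above is runnable as-is; here is the same sketch with a stand-in make_attempt that fails twice before succeeding (names are placeholders from the example, not library APIs):

```python
# Sketch: auto_retry as a plain loop with try/except, per the rewrite above.
num_attempts = 3
calls = []

def make_attempt():
    calls.append(1)
    if len(calls) < 3:
        raise IOError('transient failure')

for retry in reversed(range(num_attempts)):
    try:
        make_attempt()
        break           # success: stop retrying
    except IOError:
        if not retry:   # last attempt: re-raise instead of swallowing
            raise
```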

Steven has also convinced me that break, continue and return should look like 
normal exits rather than exceptions. This should bring my next PEP draft back to 
something resembling Guido's option 3 - the __exit__() method still gets passed 
the contents of sys.exc_info(); it just can't do anything about it other than 
raise a different exception.

Cheers,
Nick.

P.S. The points regarding non-local flow control in Joel Spolsky's latest Joel 
on Software article (especially the links at the end) may have had something to 
do with my change of heart. . .


-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From python-dev at zesty.ca  Fri May 13 00:14:24 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 12 May 2005 17:14:24 -0500 (CDT)
Subject: [Python-Dev] Tidier Exceptions
Message-ID: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>

It occurred to me as i was messing around with handling and re-raising
exceptions that tossing around these (type, value, traceback) triples
is irritating and error-prone.

How about just passing around a single value?  All we'd have to do is
put the traceback in value.traceback.

Implementation:

    - "raise Class" and "raise Class, string" automatically set
      the .traceback attribute on the new instance of Class.

    - "raise instance" automatically sets the .traceback attribute
      on the instance unless it already has one.

The behaviour of "except" and "sys.exc_*" could remain unchanged.
"raise t, v, tb" would eventually be deprecated in favour of "raise v".
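A small sketch of the single-value idea: carry the traceback on the exception instance itself, so only one object is passed around. (The attribute name 'traceback' follows the proposal above; attaching it by hand here simulates what the raise machinery would do.)

```python
# Sketch: attach the traceback to the exception instance, so the
# (type, value, traceback) triple collapses to a single value.
import sys

caught = None
try:
    raise ValueError('boom')
except ValueError as v:
    v.traceback = sys.exc_info()[2]  # what "raise instance" would do for us
    caught = v

# Type and traceback are now both recoverable from the one object:
# type(caught) and caught.traceback.
```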


-- ?!ng

From python-dev at zesty.ca  Fri May 13 00:32:50 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 12 May 2005 17:32:50 -0500 (CDT)
Subject: [Python-Dev] Chained Exceptions
Message-ID: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>

Suppose exceptions have an optional "context" attribute, which is
set when the exception is raised in the context of handling another
exception.  Thus:

    def a():
        try:
            raise AError
        except:
            raise BError

yields an exception which is an instance of BError.  This instance
would have as its "context" attribute an instance of AError.

Or, in a more complex case:

    def compute():
        try:
            1/0
        except Exception, exc:
            log(exc)

    def log(exc):
        try:
            file = open('error.log')   # oops, forgot 'w'
            print >>file, exc
            file.close()
        except:
            display(exc)

    def display(exc):
        print 'Aaaack!', ex            # oops, misspelled 'exc'

Today, this just gives you a NameError about 'ex'.

With the suggested change, you would still get a NameError about 'ex';
its 'context' attribute would show that it occurred while handling an
IOError on error.log; and this IOError would have a 'context' attribute
containing the original ZeroDivisionError that started it all.

What do you think?


-- ?!ng

From gvanrossum at gmail.com  Fri May 13 01:09:21 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 16:09:21 -0700
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
Message-ID: <ca471dc2050512160977cc60f@mail.gmail.com>

[Ka-Ping Yee]
> It occurred to me as i was messing around with handling and re-raising
> exceptions that tossing around these (type, value, traceback) triples
> is irritating and error-prone.
> 
> How about just passing around a single value?  All we'd have to do is
> put the traceback in value.traceback.

I proposed the same thing a while back (during the early hours of
writing PEP 340).

It won't fly as long as we have string exceptions (since there's
nowhere to put the traceback) but once those are dead I like it a lot.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Fri May 13 01:12:35 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 16:12:35 -0700
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
Message-ID: <ca471dc205051216121093961b@mail.gmail.com>

[Ka-Ping Yee]
> Suppose exceptions have an optional "context" attribute, which is
> set when the exception is raised in the context of handling another
> exception.  Thus:
> 
>     def a():
>         try:
>             raise AError
>         except:
>             raise BError
> 
> yields an exception which is an instance of BError.  This instance
> would have as its "context" attribute an instance of AError.
[...]

I like the idea, but I'm not sure about the consequences, and I'm not
sure how it can be defined rigorously. Would it only happen when
something *in* an except clause raises an exception? Which piece of
code would be responsible for doing this?

Try to come up with a precise specification and we'll talk.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From bac at OCF.Berkeley.EDU  Fri May 13 01:42:46 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Thu, 12 May 2005 16:42:46 -0700
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <ca471dc2050512160977cc60f@mail.gmail.com>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
Message-ID: <4283E9F6.1020500@ocf.berkeley.edu>

Guido van Rossum wrote:
> [Ka-Ping Yee]
> 
>>It occurred to me as i was messing around with handling and re-raising
>>exceptions that tossing around these (type, value, traceback) triples
>>is irritating and error-prone.
>>
>>How about just passing around a single value?  All we'd have to do is
>>put the traceback in value.traceback.
> 
> 
> I proposed the same thing a while back (during the early hours of
> writing PEP 340).
> 
> It won't fly as long as we have string exceptions (since there's
> nowhere to put the traceback) but once those are dead I like it a lot.
> 

Seems like, especially if we require inheritance from a base exception class in
Python 3000, exceptions should have standard 'arg' and 'traceback' attributes
with a possible 'context' attribute (or always a 'context' attribute set to
None if not a chained exception).

I don't think there is any other data normally associated with exceptions, is there?

I really need to get off my ass one of these days and just write a PEP targeted
at Python 3000 with base inheritance, standard attributes (including exception
chains), reworking the built-in exception inheritance hierarchy, and whether
bare 'except' statements should go or only catch certain exceptions.  Could
probably stand to break it up into multiple PEPs, though.  =)

-Brett

From bac at OCF.Berkeley.EDU  Fri May 13 01:47:56 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Thu, 12 May 2005 16:47:56 -0700
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <ca471dc205051216121093961b@mail.gmail.com>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
	<ca471dc205051216121093961b@mail.gmail.com>
Message-ID: <4283EB2C.7040602@ocf.berkeley.edu>

Guido van Rossum wrote:
> [Ka-Ping Yee]
> 
>>Suppose exceptions have an optional "context" attribute, which is
>>set when the exception is raised in the context of handling another
>>exception.  Thus:
>>
>>    def a():
>>        try:
>>            raise AError
>>        except:
>>            raise BError
>>
>>yields an exception which is an instance of BError.  This instance
>>would have as its "context" attribute an instance of AError.
> 
> [...]
> 
> I like the idea, but I'm not sure about the consequences, and I'm not
> sure how it can be defined rigorously. Would it only happen when
> something *in* an except clause raises an exception? Which piece of
> code would be responsible for doing this?
> 
> Try to come up with a precise specification and we'll talk.
> 

If a new exception is raised (i.e., not a bare 'raise') while a current
exception is active (i.e., sys.exc_info() would return something other than a
tuple of Nones), then the new exception is made the active exception, and the
old exception is assigned to the new exception's 'context' attribute.
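A pure-Python approximation of that rule, using an explicit helper since only the interpreter could apply it implicitly (the helper name and the 'context' attribute are from this proposal, not existing machinery):

```python
# Sketch: record the currently-active exception as the new exception's
# 'context' before raising it, per the rule described above.
import sys

def raise_with_context(new_exc):
    active = sys.exc_info()[1]
    if active is not None and active is not new_exc:
        new_exc.context = active
    raise new_exc

result = None
try:
    try:
        raise KeyError('first')
    except KeyError:
        # A *new* exception raised while 'first' is active:
        raise_with_context(ValueError('second'))
except ValueError as e:
    result = e
# result.context is the original KeyError.
```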

-Brett

From python-dev at zesty.ca  Fri May 13 02:36:15 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 12 May 2005 19:36:15 -0500 (CDT)
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <4283EB2C.7040602@ocf.berkeley.edu>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
	<ca471dc205051216121093961b@mail.gmail.com>
	<4283EB2C.7040602@ocf.berkeley.edu>
Message-ID: <Pine.LNX.4.58.0505121929360.14555@server1.LFW.org>

On Thu, 12 May 2005, Brett C. wrote:
> Guido van Rossum wrote:
> > Try to come up with a precise specification and we'll talk.
>
> If a new exception is raised (e.g., not a bare 'raise') while a current
> exception is active (e.g., sys.exc_info() would return something other
> than a tuple of None), then the new exception is made the active exception
> and the now old exception is assigned to the new exception's context
> attribute to be the old exception.

Yeah, i think that's basically all there is to it.  I'll go have a peek
at the interpreter to see if i'm forgetting something.


-- ?!ng

From gvanrossum at gmail.com  Fri May 13 02:43:48 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 17:43:48 -0700
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <4283EB2C.7040602@ocf.berkeley.edu>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
	<ca471dc205051216121093961b@mail.gmail.com>
	<4283EB2C.7040602@ocf.berkeley.edu>
Message-ID: <ca471dc2050512174318ba97a8@mail.gmail.com>

[Brett C.]
> If a new exception is raised (e.g., not a bare 'raise') while a current
> exception is active (e.g., sys.exc_info() would return something other than a
> tuple of None), then the new exception is made the active exception and the now
> old exception is assigned to the new exception's context attribute to be the
> old exception.

Define "raise". Does that involve a raise statement? What about 1/0?
What if you call a method that executes 1/0? What if that method
catches that exception? What about the StopIteration conceptually
raised by next() called by the for-loop implementation? (Often it
doesn't get instantiated at all when the next() method belongs to a
built-in iterator.)

I believe there are (at least) two use cases:

(1) I catch some low-level exception (e.g. socket.error) and turn it
into a high-level exception (e.g. an HTTPRequestFailed exception).

(2) I write some exception handling code and somehow a bug in the
handler (or an uncooperative environment, e.g. a full disk) causes the
exception handling code to trip over an exception.

I'm fairly certain (but not 100%) that Ping meant to include both use cases.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Fri May 13 02:46:26 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 17:46:26 -0700
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <4283E9F6.1020500@ocf.berkeley.edu>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<4283E9F6.1020500@ocf.berkeley.edu>
Message-ID: <ca471dc2050512174655dfc7d6@mail.gmail.com>

[Brett C.]
> Seems like, especially if we require inheritance from a base exception class in
> Python 3000, exceptions should have standard 'arg' and 'traceback' attributes
> with a possible 'context' attribute (or always a 'context' attribute set to
> None if not a chained exception).
> 
> I don't think there is other data normally associated with exceptions is there?

I despise the "arg" argument -- I like Java's "message" concept better.

> I really need to get off my ass one of these days and just write a PEP targeted
> for Python 3000 with base inheritance, standard attributes (including exception
> chains), reworking the built-in exception inheritance hierarchy, and whether
> bare 'except' statements should go or only catch certain exceptions.  Could
> probably stand to break it up into multiple PEPs, though.  =)

+1.

I think these things are sufficiently closely related to keep them all
in one PEP.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From python-dev at zesty.ca  Fri May 13 02:59:17 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 12 May 2005 19:59:17 -0500 (CDT)
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <ca471dc2050512174318ba97a8@mail.gmail.com>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org> 
	<ca471dc205051216121093961b@mail.gmail.com>
	<4283EB2C.7040602@ocf.berkeley.edu>
	<ca471dc2050512174318ba97a8@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505121944440.14555@server1.LFW.org>

On Thu, 12 May 2005, Guido van Rossum wrote:
> Define "raise". Does that involve a raise statement?

Not necessarily; it could be a raise statement or an inadvertently
triggered exception, such as in the example code i posted.

> What about 1/0?

That counts.

> What if you call a method that executes 1/0?

That counts too.

> What if that method catches that exception?

Did you mean something like this?

    def handle():
        try:
            open('spamspamspam')
        except:
            catchit()
            # point A
            ...

    def catchit():
        try:
            1/0
        except:
            pass

Then there's no exception to propagate, so it doesn't matter.
Once we get to point A, the division by zero is long forgotten.

> What about the StopIteration conceptually
> raised by next() called by the for-loop implementation?

It's "caught" by the for-loop, so to speak, so it never gets out.
Conceptually, the for-loop expands to:

    while 1:
        try:
            item = it.next()
        except StopIteration:
            break
        # body of loop goes here

The 'break' can't possibly cause an exception, so the StopIteration
exception isn't retained.
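In the behaviour Python 3 eventually shipped, this is indeed how it works:
the loop machinery consumes the StopIteration itself, so it is never attached
as context. A small check (Python 3 semantics assumed):

```python
# The StopIteration that terminates a for-loop is caught by the loop
# implementation itself, so it never becomes the "active" exception.
for item in iter([1, 2, 3]):
    pass

try:
    raise RuntimeError("raised after the loop")
except RuntimeError as exc:
    assert exc.__context__ is None   # no StopIteration was retained
```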

> I believe there are (at least) two use cases:
>
> (1) I catch some low-level exception (e.g. socket.error) and turn it
> into a high-level exception (e.g. an HTTPRequestFailed exception).
>
> (2) I write some exception handling code and somehow a bug in the
> handler (or an uncooperative environment, e.g. a full disk) causes the
> exception handling code to trip over an exception.
>
> I'm fairly certain (but not 100%) that Ping meant to include both use cases.

Yes, though i did not expect to provide any mechanism for distinguishing
the two cases.  Do you think such a mechanism would be necessary?


-- ?!ng

From python-dev at zesty.ca  Fri May 13 03:06:12 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 12 May 2005 20:06:12 -0500 (CDT)
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <4283E9F6.1020500@ocf.berkeley.edu>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<4283E9F6.1020500@ocf.berkeley.edu>
Message-ID: <Pine.LNX.4.58.0505122000530.14555@server1.LFW.org>

On Thu, 12 May 2005, Brett C. wrote:
> whether bare 'except' statements should go or only catch certain exceptions.

Maybe bare 'except' should be spelled 'except *'.

I don't think it can be removed altogether because sometimes you just
need to be able to do magic, but it can be made a little more explicit.

With the asterisk, it's greppable, and editors can find it or highlight
it.  I like the parallel to 'import *' (frowned upon, but sometimes
useful if you really know what you are doing).


-- ?!ng

From pje at telecommunity.com  Fri May 13 03:14:41 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Thu, 12 May 2005 21:14:41 -0400
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <Pine.LNX.4.58.0505121929360.14555@server1.LFW.org>
References: <4283EB2C.7040602@ocf.berkeley.edu>
	<Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
	<ca471dc205051216121093961b@mail.gmail.com>
	<4283EB2C.7040602@ocf.berkeley.edu>
Message-ID: <5.1.1.6.0.20050512210939.021e41a8@mail.telecommunity.com>

At 07:36 PM 5/12/2005 -0500, Ka-Ping Yee wrote:
>On Thu, 12 May 2005, Brett C. wrote:
> > Guido van Rossum wrote:
> > > Try to come up with a precise specification and we'll talk.
> >
> > If a new exception is raised (e.g., not a bare 'raise') while a current
> > exception is active (e.g., sys.exc_info() would return something other
> > than a tuple of None), then the new exception is made the active exception
> > and the now old exception is assigned to the new exception's context
> > attribute to be the old exception.
>
>Yeah, i think that's basically all there is to it.  I'll go have a peek
>at the interpreter to see if i'm forgetting something.

I think the main problem is going to be that (IIUC), Python doesn't "know" 
when you've exited an 'except:' clause and are therefore no longer 
handling the exception.  sys.exc_info() still gives you the exception you 
just caught.  I think that a lot of the questions Guido brought up are 
directly related to this.

Also, what about code like this:

     try:
         doSomething()
     except SomeError:
         pass

     doSomethingElse()

Should exceptions raised by doSomethingElse() be treated as having the 
SomeError as their context, if it was raised?

If I understand correctly, the interpreter cannot currently distinguish 
between this, and the case where an error is raised inside the 'except' clause.
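Python 3 later resolved exactly this by having the compiler track handler
exit: once the except clause is left, the caught exception is no longer
active and is not attached as context. A sketch of the resolved behaviour:

```python
# After the except block finishes, the caught exception is no longer
# "active", so a later raise gets no context attached (Python 3 semantics).
def run():
    try:
        1 / 0                      # stands in for doSomething() failing
    except ZeroDivisionError:
        pass                       # handler exits here
    raise KeyError("from doSomethingElse")   # raised outside the handler

try:
    run()
except KeyError as exc:
    assert exc.__context__ is None
```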


From gvanrossum at gmail.com  Fri May 13 03:25:01 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 18:25:01 -0700
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <Pine.LNX.4.58.0505122000530.14555@server1.LFW.org>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<4283E9F6.1020500@ocf.berkeley.edu>
	<Pine.LNX.4.58.0505122000530.14555@server1.LFW.org>
Message-ID: <ca471dc2050512182557c89ebd@mail.gmail.com>

[Ka-Ping Yee]
> Maybe bare 'except' should be spelled 'except *'.

-1.

> I don't think it can be removed altogether because sometimes you just
> need to be able to do magic, but it can be made a little more explicit.

Assuming a single root of the exception tree, you can spell it
explicitly as "except Exception" or perhaps (if that's not the root)
"except Raisable" (cf. Java's Throwable).

> With the asterisk, it's greppable, and editors can find it or highlight
> it.  I like the parallel to 'import *' (frowned upon, but sometimes
> useful if you really know what you are doing).

How is "except:" less greppable?

And is *args also frowned upon?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From python-dev at zesty.ca  Fri May 13 03:27:32 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 12 May 2005 20:27:32 -0500 (CDT)
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <ca471dc2050512182557c89ebd@mail.gmail.com>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<4283E9F6.1020500@ocf.berkeley.edu>
	<Pine.LNX.4.58.0505122000530.14555@server1.LFW.org>
	<ca471dc2050512182557c89ebd@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505122027060.14555@server1.LFW.org>

On Thu, 12 May 2005, Guido van Rossum wrote:
> How is "except:" less greppable?

Duh.  I'm slow today.


-- ?!ng

From gvanrossum at gmail.com  Fri May 13 03:27:33 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 18:27:33 -0700
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <5.1.1.6.0.20050512210939.021e41a8@mail.telecommunity.com>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
	<ca471dc205051216121093961b@mail.gmail.com>
	<4283EB2C.7040602@ocf.berkeley.edu>
	<Pine.LNX.4.58.0505121929360.14555@server1.LFW.org>
	<5.1.1.6.0.20050512210939.021e41a8@mail.telecommunity.com>
Message-ID: <ca471dc205051218275e8394ed@mail.gmail.com>

[Phillip J. Eby]
> I think the main problem is going to be that (IIUC), Python doesn't "know"
> when you've  exited an 'except:' clause and are therefore no longer
> handling the exception.

But the compiler knows and could insert code to maintain this state.

> sys.exc_info() still gives you the exception you
> just caught.  I think that a lot of the questions Guido brought up are
> directly related to this.

Right.

> Also, what about code like this:
> 
>      try:
>          doSomething()
>      except SomeError:
>          pass
> 
>      doSomethingElse()
> 
> Should exceptions raised by doSomethingElse() be treated as having the
> SomeError as their context, if it was raised?
> 
> If I understand correctly, the interpreter cannot currently distinguish
> between this, and the case where an error is raised inside the 'except' clause.

Indeed the interpreter currently doesn't distinguish between these,
but I think it ought to for the purposes of this proposal.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Fri May 13 03:31:20 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 18:31:20 -0700
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <Pine.LNX.4.58.0505121944440.14555@server1.LFW.org>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
	<ca471dc205051216121093961b@mail.gmail.com>
	<4283EB2C.7040602@ocf.berkeley.edu>
	<ca471dc2050512174318ba97a8@mail.gmail.com>
	<Pine.LNX.4.58.0505121944440.14555@server1.LFW.org>
Message-ID: <ca471dc205051218316ca4c611@mail.gmail.com>

[Guido]
> > What if that method catches that exception?

[Ka-Ping Yee]
> Did you mean something like this?
> 
>     def handle():
>         try:
>             open('spamspamspam')
>         except:
>             catchit()
>             # point A
>             ...
> 
>     def catchit():
>         try:
>             1/0
>         except:
>             pass
> 
> Then there's no exception to propagate, so it doesn't matter.
> Once we get to point A, the division by zero is long forgotten.

But at what point does the attaching happen? If I catch the
ZeroDivisionError inside catchit() and inspect its context
attribute, does it reference the IOError instance raised by
open('spamspamspam')? This could potentially cause a lot of extra
work: when an inner loop that raises and catches lots of exceptions is
invoked in the context of having caught an exception at some outer
level, the inner loop keeps attaching the outer exception to each
exception raised.
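In the implementation Python 3 later adopted, this extra work does happen:
every exception raised while an outer exception is being handled gets the
outer one attached. A small demonstration (Python 3 semantics assumed):

```python
# Each raise inside the handler records the outer exception as context,
# once per raise -- the repeated attaching Guido is worried about here.
try:
    raise KeyError("outer")
except KeyError as outer:
    for i in range(3):
        try:
            raise ValueError(i)                 # inner loop raises...
        except ValueError as inner:
            assert inner.__context__ is outer   # ...attaching "outer" each time
```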

> Yes, though i did not expect to provide any mechanism for distinguishing
> the two cases.  Do you think such a mechanism would be necessary?

No, I was just trying to figure out what you meant when you said
"raise". It's clear now.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From shane at hathawaymix.org  Fri May 13 03:51:26 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Thu, 12 May 2005 19:51:26 -0600
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505112028b263df6@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>	<4281E14C.1070405@gmail.com>	<79990c6b05051105361e7a9ba0@mail.gmail.com>	<42827154.4040006@gmail.com>	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com>
Message-ID: <4284081E.6080100@hathawaymix.org>

Guido van Rossum wrote:
> Going for all-out simplicity, I would like to be able to write these examples:
> 
> class locking:
>     def __init__(self, lock): self.lock = lock
>     def __enter__(self): self.lock.acquire()
>     def __exit__(self, *args): self.lock.release()
> 
> class opening:
>     def __init__(self, filename): self.filename = filename
>     def __enter__(self): self.f = open(self.filename); return self.f
>     def __exit__(self, *args): self.f.close()
> 
> And do EXPR as VAR: BLOCK would mentally be translated into
> 
> itr = EXPR
> VAR = itr.__enter__()
> try: BLOCK
> finally: itr.__exit__(*sys.exc_info()) # Except sys.exc_info() isn't
> defined by finally

If it's this simple, it should be possible to write something that
combines the acquisition of multiple resources in a single statement.
For example:

    with combining(opening(src_fn), opening(dst_fn, 'w')) as src, dst:
        copy(src, dst)

I think the following class would do it.

    class combining:
        def __init__(self, *resources):
            self.resources = resources
            self.entered = 0

        def __enter__(self):
            results = []
            try:
                for r in self.resources:
                    results.append(r.__enter__())
                    self.entered += 1
                return results
            except:
                # exit resources before re-raising the exception
                self.__exit__()
                raise

        def __exit__(self, *args):
            last_exc = None
            # exit only the resources successfully entered
            to_exit = self.resources[:self.entered]
            while to_exit:
                r = to_exit.pop()
                try:
                    r.__exit__(*args)
                except:
                    # re-raise the exception after exiting the others
                    last_exc = sys.exc_info()
            if last_exc is not None:
                raise last_exc[0], last_exc[1], last_exc[2]

Would that work?
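For comparison, later Python versions grew contextlib.ExitStack, which
packages this same "unwind only the resources actually entered" pattern;
a sketch (the file names are illustrative):

```python
import os
import tempfile
from contextlib import ExitStack

# Set up an illustrative source file in a scratch directory.
tmp = tempfile.mkdtemp()
src_fn = os.path.join(tmp, "src.txt")
dst_fn = os.path.join(tmp, "dst.txt")
with open(src_fn, "w") as f:
    f.write("hello")

# ExitStack enters each resource in turn and, on exit or error, closes
# only those successfully entered -- like the combining class above.
with ExitStack() as stack:
    src = stack.enter_context(open(src_fn))
    dst = stack.enter_context(open(dst_fn, "w"))
    dst.write(src.read())

with open(dst_fn) as f:
    assert f.read() == "hello"
```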

Shane

From foom at fuhm.net  Fri May 13 03:56:21 2005
From: foom at fuhm.net (James Y Knight)
Date: Thu, 12 May 2005 21:56:21 -0400
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
Message-ID: <783B96D4-35EE-4B26-9158-B80AA7006ECA@fuhm.net>


On May 12, 2005, at 6:32 PM, Ka-Ping Yee wrote:
> Suppose exceptions have an optional "context" attribute, which is
> set when the exception is raised in the context of handling another
> exception.  Thus:
>
>     def a():
>         try:
>             raise AError
>         except:
>             raise BError
>
> yields an exception which is an instance of BError.  This instance
> would have as its "context" attribute an instance of AError.
>

I think it's a bad idea to have this happen automatically. Many times  
if an exception is raised in the except clause, that doesn't  
necessarily imply it's related to the original exception. It just  
means there's a bug in the exception handler.

Take the divide by 0 example:
try:
   doABunchOfStuff()
except:
   1/0

If you're going to do anything useful with the chained exception  
information (such as have it printed by the default exception  
printer), it'd be best to not print a bunch of irrelevant backtraces  
for all exceptions up the stack. The reason that doABunchOfStuff  
failed is not important. What is important is only that you had a  
divide by 0.

In my mind it's much better to be explicit about your intentions, via  
something like:

try:
   raise AError
except Raiseable, e:
   raise BError(cause=e)

Of course you can already do similar with current python, it just  
can't be spelled as nicely, and the default traceback printer won't  
use the info:

try:
   raise AError
except:
   newException = BError()
   newException.cause=sys.exc_info()
   raise newException
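James's explicit spelling is essentially what Python 3 later standardised as
"raise X from Y", which stores the original exception on __cause__ and has
the default traceback printer display both; a sketch with stand-in classes:

```python
# Hedged sketch of the "explicit cause" form Python 3 adopted; AError and
# BError are stand-ins for the names used in the example above.
class AError(Exception): pass
class BError(Exception): pass

def convert():
    try:
        raise AError("low-level failure")
    except AError as e:
        raise BError("high-level failure") from e   # explicit cause

try:
    convert()
except BError as exc:
    assert isinstance(exc.__cause__, AError)
```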

James

From gvanrossum at gmail.com  Fri May 13 04:15:56 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 19:15:56 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <4284081E.6080100@hathawaymix.org>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com>
	<4284081E.6080100@hathawaymix.org>
Message-ID: <ca471dc205051219151a3e3d85@mail.gmail.com>

[Shane Hathaway]
> If it's this simple, it should be possible to write something that
> combines the acquisition of multiple resources in a single statement.
> For example:
> 
>     with combining(opening(src_fn), opening(dst_fn, 'w')) as src, dst:
>         copy(src, dst)

Yeah (and I don't see anything wrong with your implementation of
combining either), but even if that existed I think I'd prefer to just
write

  with opening(src_fn) as src:
      with opening(dst_fn) as dst:
          copy(src, dst)

See Ma, no magic! :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tdelaney at avaya.com  Fri May 13 04:17:55 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Fri, 13 May 2005 12:17:55 +1000
Subject: [Python-Dev] Chained Exceptions
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721280@au3010avexu1.global.avaya.com>

James Y Knight wrote:

> Of course you can already do similar with current python, it just
> can't be spelled as nicely, and the default traceback printer won't
> use the info:
> 
> try:
>    raise AError
> except:
>    newException = BError()
>    newException.cause=sys.exc_info()
>    raise newException

Well, one thing you can do is (somewhat evil ;)

::

    import sys

    try:
        raise AError, 'message'
    except:
        exc_type, exc_value, exc_traceback = sys.exc_info()
        raise BError, exc_value, exc_traceback

with the result:

    Traceback (most recent call last):
        File ...
            raise AError, 'message'
    BError: message

So you store the original exception as the argument to the new exception
(so it's accessible). This has the nice side effect that message is
displayed in the traceback - but the type has changed.

Whilst in the above example it's particularly evil, in the case where
the original exception came from a function call and you want to
translate the type, it works very nicely.
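Under Python 3 the three-argument raise is gone, but the same trick can be
rendered with with_traceback(), which re-raises under a new type while
keeping the original traceback (AError/BError are stand-in names):

```python
import sys

class AError(Exception): pass
class BError(Exception): pass

def translate():
    try:
        raise AError("message")
    except AError:
        exc_type, exc_value, exc_traceback = sys.exc_info()
        # Change the type but keep the message and the original traceback.
        raise BError(*exc_value.args).with_traceback(exc_traceback)

try:
    translate()
except BError as exc:
    assert exc.args == ("message",)
```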

Tim Delaney

From gvanrossum at gmail.com  Fri May 13 04:18:53 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 12 May 2005 19:18:53 -0700
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <783B96D4-35EE-4B26-9158-B80AA7006ECA@fuhm.net>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
	<783B96D4-35EE-4B26-9158-B80AA7006ECA@fuhm.net>
Message-ID: <ca471dc20505121918569d9d76@mail.gmail.com>

[James Y Knight ]
> I think it's a bad idea to have this happen automatically. Many times
> if an exception is raised in the except clause, that doesn't
> necessarily imply it's related to the original exception. It just
> means there's a bug in the exception handler.

Yeah, but especially in that case I think it would be nice if the
traceback printed by the system (if all this doesn't get caught at an
outer level) could show both the traceback from the handler and the
traceback that it was trying to handle -- I've had many occasions
where a trivial bug in the handler blew away the original traceback
which was shy enough to make repeating it a pain.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From michele.simionato at gmail.com  Fri May 13 05:58:04 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Thu, 12 May 2005 23:58:04 -0400
Subject: [Python-Dev] the current behavior of try: ... finally:
Message-ID: <4edc17eb050512205832a1c15a@mail.gmail.com>

All this talk about try: ... finally: and exceptions reminded me of a curious
behavior I discovered a while back, i.e. that finally can swallow
your exceptions. This is a contrived example, but shows the point:

def divide1(n1, n2): 
    try:
        result = n1/n2
    finally:
        print "cleanup"
        result = "Infinity\n"
        return result # the exception is swallowed away

def divide2(n1, n2):
    try:
        result = n1/n2
    finally:
        print "cleanup"
        result = "Infinity\n"
    return result # the exception is NOT swallowed away

print divide1(10, 0) # no-exception
print divide2(10, 0) # error

If there is an indentation error in "divide2" and the return line is
indented too far, the exception gets swallowed by the finally clause.

I am not sure if this is good or bad, but it sure surprised me that a
finally clause could hide my exception.
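The same behaviour can be checked under Python 3, where it persists: a
return executed inside a finally clause replaces any in-flight exception.

```python
# A return in a finally block swallows the pending exception; moving the
# return outside the finally lets the exception propagate instead.
def divide1(n1, n2):
    try:
        result = n1 / n2
    finally:
        result = "Infinity\n"
        return result            # swallows the ZeroDivisionError

def divide2(n1, n2):
    try:
        result = n1 / n2
    finally:
        result = "Infinity\n"
    return result                # reached only if no exception occurred

assert divide1(10, 0) == "Infinity\n"
try:
    divide2(10, 0)
    raise AssertionError("expected ZeroDivisionError")
except ZeroDivisionError:
    pass
```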

                     Michele Simionato

From greg.ewing at canterbury.ac.nz  Fri May 13 07:15:06 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 May 2005 17:15:06 +1200
Subject: [Python-Dev] the current behavior of try: ... finally:
In-Reply-To: <4edc17eb050512205832a1c15a@mail.gmail.com>
References: <4edc17eb050512205832a1c15a@mail.gmail.com>
Message-ID: <428437DA.5090800@canterbury.ac.nz>

Michele Simionato wrote:

> def divide1(n1, n2): 
>     try:
>         result = n1/n2
>     finally:
>         print "cleanup"
>         result = "Infinity\n"
>         return result # the exception is swallowed away

What would you prefer to have happen in this case?

Or do you think return (and break and continue) should
be disallowed in a finally?

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Fri May 13 07:15:08 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 May 2005 17:15:08 +1200
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <ca471dc2050512160977cc60f@mail.gmail.com>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
Message-ID: <428437DC.7030304@canterbury.ac.nz>

Guido van Rossum wrote:
> It won't fly as long as we have string exceptions (since there's
> nowhere to put the traceback) but once those are dead I like it a lot.

Are there plans as to when string exceptions will be
exterminated? Surely the only places they're used now
are in some very old library modules.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Fri May 13 07:15:10 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 May 2005 17:15:10 +1200
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <4283E9F6.1020500@ocf.berkeley.edu>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<4283E9F6.1020500@ocf.berkeley.edu>
Message-ID: <428437DE.60204@canterbury.ac.nz>

Brett C. wrote:

> Seems like, especially if we require inheritance from a base exception class in
> Python 3000, exceptions should have standard 'arg' and 'traceback' attributes
> with a possible 'context' attribute (or always a 'context' attribute set to
> None if not a chained exception).

Instead of an 'args' attribute, I'd suggest that
the constructor take keyword arguments and store
them in corresponding attributes. Then interested
parties could retrieve them by name instead of
having to remember their positions in the args
tuple of the exception class concerned.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Fri May 13 07:15:20 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 May 2005 17:15:20 +1200
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc205051207504facf994@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc205051207504facf994@mail.gmail.com>
Message-ID: <428437E8.4000701@canterbury.ac.nz>

Guido van Rossum wrote:

> - Greg Ewing (I believe) wants 'do' instead of 'with' for the
>   keyword.  I think I like 'with' better, especially combining it with
>   Benji's proposal.  IMO this reads better with 'with' than with 'do':
> 
>     with open("/etc/passwd") as f:
>         for line in f:
>             ...

I don't think I like the idea of giving the file object
itself __enter__ and __exit__ methods, because it doesn't
ensure that the opening and closing are done as a pair.
It would permit the following kind of mistake:

   f = open("somefile")
   with f:
     do_something()
   with f:
     do_something_else()

which our proposed construct, if it is any good, should
be able to prevent.

Also I don't at all agree that "with open(...)" reads
better; on the contrary, it seems ungrammatical.
Especially when compared with the very beautiful
"do opening(...)", which I would be disappointed
to give up.

I still also have reservations about "with" on the
grounds that we're making it mean something very
different to what it means in most other languages
that have a "with".

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Fri May 13 07:15:30 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 May 2005 17:15:30 +1200
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <4edc17eb05051208341b6bf751@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<77E275CA-80DD-4F4C-A676-2FB36AD8B0A3@aleax.it>
	<5.1.1.6.0.20050510114139.021a0138@mail.telecommunity.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
Message-ID: <428437F2.9020409@canterbury.ac.nz>

Michele Simionato wrote:

> let lock:
>    do_something
> 
> let open("myfile") as f:
>     for line in f: do_something(line)

This is getting even further into the realm
of gibberish to my ear.

> let f=file("myfile") :
>     for line in f: do_something(line)

To anyone with a Lisp or functional background, that
looks like nothing more than a local variable
binding.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From michele.simionato at gmail.com  Fri May 13 07:22:06 2005
From: michele.simionato at gmail.com (Michele Simionato)
Date: Fri, 13 May 2005 01:22:06 -0400
Subject: [Python-Dev] the current behavior of try: ... finally:
In-Reply-To: <428437DA.5090800@canterbury.ac.nz>
References: <4edc17eb050512205832a1c15a@mail.gmail.com>
	<428437DA.5090800@canterbury.ac.nz>
Message-ID: <4edc17eb0505122222e6ec27c@mail.gmail.com>

On 5/13/05, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Michele Simionato wrote:
> 
> > def divide1(n1, n2):
> >     try:
> >         result = n1/n2
> >     finally:
> >         print "cleanup"
> >         result = "Infinity\n"
> >         return result # the exception is swallowed away
> 
> What would you prefer to have happen in this case?
> 
> Or do you think return (and break and continue) should
> be disallowed in a finally?
> 

Honestly, I don't know. This is why I ask here ;)

          Michele Simionato

From sakesun at boonthavorn.com  Fri May 13 08:00:07 2005
From: sakesun at boonthavorn.com (Sakesun Roykiattisak)
Date: Fri, 13 May 2005 13:00:07 +0700
Subject: [Python-Dev] the current behavior of try: ... finally:
In-Reply-To: <4edc17eb050512205832a1c15a@mail.gmail.com>
References: <4edc17eb050512205832a1c15a@mail.gmail.com>
Message-ID: <42844267.4070109@boonthavorn.com>


It did surprise me also, because I've come to Python from Delphi,
and there is no return statement in Delphi.
I also write some C++, which has no finally statement. This
problem is probably exclusive to Python.

I think it's not too difficult to get used to it. This behavior is fine 
for me.

From gvanrossum at gmail.com  Fri May 13 11:06:27 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 02:06:27 -0700
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <428437DC.7030304@canterbury.ac.nz>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<428437DC.7030304@canterbury.ac.nz>
Message-ID: <ca471dc20505130206a7e8b7a@mail.gmail.com>

[Greg Ewing]
> Are there plans as to when string exceptions will be
> exterminated? Surely the only places they're used now
> are in some very old library modules.

No concrete plans; I was always planning to abandon them in 3.0 but
haven't felt the need to do it sooner. Last I looked Zope 2 still
depended on them (especially in the bowels of ZODB); maybe Tim Peters
knows if that's still the case.

If you want to do it sooner, maybe we need a small PEP with the
timeline (e.g. warn in Python 2.5, illegal in Python 2.6). Or perhaps
a patch on SF is all that's needed.

I expect it would be much more challenging to switch to the model
where all exceptions derive from a single (new-style) base class.

(And no, there are no plans to kill classic classes before 3.0 either.)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From ncoghlan at gmail.com  Fri May 13 11:23:52 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 13 May 2005 19:23:52 +1000
Subject: [Python-Dev] "with" use case: replacing a file
In-Reply-To: <ca471dc205051215326d1bda4c@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	
	<4281E14C.1070405@gmail.com>	
	<79990c6b05051105361e7a9ba0@mail.gmail.com>	
	<42827154.4040006@gmail.com>	
	<d11dcfba05051114441404ef0a@mail.gmail.com>	
	<42833090.3060905@gmail.com>	
	<d11dcfba050512065272ce7a14@mail.gmail.com>	
	<ca471dc205051207504facf994@mail.gmail.com>	
	<5.1.1.6.0.20050512162602.01f7c420@mail.telecommunity.com>	
	<4283D4B0.10306@gmail.com>
	<ca471dc205051215326d1bda4c@mail.gmail.com>
Message-ID: <42847228.6070009@gmail.com>

Guido van Rossum wrote:
>>P.S. The points regarding non-local flow control in Joel Spolsky's latest Joel
>>on Software article (especially the links at the end) may have had something to
>>do with my change of heart. . .
> 
> 
> I'm a big fan of Joel. Care to share the specific URL for the article
> you're referring to?
> 

Sorry about that (I was in a hurry this morning). It was here:
http://www.joelonsoftware.com/articles/Wrong.html

The link of particular interest regarding exception handling was this one:
http://blogs.msdn.com/oldnewthing/archive/2005/01/14/352949.aspx

It makes some interesting points about the cons of exception-based code (at least
some of which relate to what we're dealing with in factoring out finally
clauses, and which C++ deals with via scope-based destruction).

Anyway, it made me realise that the possibility of any callable you invoke
raising exceptions is already tricky to deal with, and allowing a different call
to potentially *suppress* those exceptions is a recipe for serious confusion.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From gvanrossum at gmail.com  Fri May 13 12:05:20 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 03:05:20 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <428437F2.9020409@canterbury.ac.nz>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
Message-ID: <ca471dc20505130305233b330e@mail.gmail.com>

I just read Raymond Chen's rant against control flow macros:
http://blogs.msdn.com/oldnewthing/archive/2005/01/06/347666.aspx

I think this pretty much kills PEP 340, as well as Nick Coghlan's
alternative: both proposals let you write a "template" that can be
used to hide exception-catching code, which is a form of control flow
(and a pretty important one if you read Chen's rant against exceptions
referenced by the former, even if you don't agree with everything he
says in the latter).

Which leaves us, IMO, with the choice between PEP 310 and my own
"PEP-340-redux" proposal; these *only* introduce a finally-clause,
which does not affect the control flow. I'm not counting exceptions
that might happen in the finally-clause; exceptions can happen
anywhere anyway. But I am counting the *catching* of an exception as
control flow, since that means that code past BLOCK (in the same
function) is reachable even if BLOCK was not executed to completion;
and this is the argument against PEP 340 and against Nick's
alternative.

Let's compare and contrast the two remaining competitors:

PEP 310
=======

Syntax:
with EXPR [= VAR]:
    BLOCK

Translation:
[VAR =] abc = EXPR
if hasattr(abc, "__enter__"):
    abc.__enter__()
try:
    BLOCK
finally:
    abc.__exit__()

Pros:
- dead simple

Cons:
- can't use a decorated generator for EXPR

PEP 340 redux
=============

Syntax:
do EXPR [as VAR]:
    BLOCK

Translation:
abc = EXPR
[VAR =] abc.__enter__()
try:
    BLOCK
finally:
    abc.__exit__(*"sys.exc_info()") # Not exactly

Pros:
- can use a decorated generator as EXPR
- separation of EXPR and VAR (VAR gets what EXPR.__enter__() returns)

Cons:
- slightly less simple (__enter__ must return something for VAR;
  __exit__ takes optional args)

Everything else is equal or can be made equal. We can make them more
equal by treating the arguments passed to __exit__() as a separate
decision, and waffling about whether __enter__() should be optional (I
think it's a bad idea even for PEP 310; it *could* be made optional
for PEP 340 redux).

Let's also not quibble about the keyword used; again, that can be a
separate decision. Note that only PEP 310 can use the "VAR = EXPR"
syntax; PEP 340 redux *must* use "EXPR as VAR" since it doesn't assign
the value of EXPR to VAR; PEP 310 can be rewritten using this syntax
as well.

So then the all-important question I want to pose is: do we like the
idea of using a (degenerate, decorated) generator as a "template" for
the do-statement enough to accept the slightly increased complexity?
The added complexity is caused by the need to separate VAR from EXPR
so that a generator can be used. I personally like this separation; I
actually like that the "anonymous block controller" is logically
separate from the variable bound by the construct. From Greg Ewing's
response to the proposal to endow file objects with __enter__ and
__exit__ methods, I believe he thinks so too.
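A sketch of that separation, using modern Python for the mechanics (the decorator and names here are hypothetical, not the PEP's actual machinery): EXPR produces the controller object, while VAR is bound to whatever __enter__() returns, in this case the generator's yielded value.

```python
class DoTemplate:
    """Wrap a one-yield generator function as a do-statement controller."""
    def __init__(self, genfunc, args):
        self.genfunc = genfunc
        self.args = args

    def __enter__(self):
        self.gen = self.genfunc(*self.args)
        return next(self.gen)  # the yielded value becomes VAR

    def __exit__(self):
        try:
            next(self.gen)     # run the cleanup code after the yield
        except StopIteration:
            pass
        else:
            raise RuntimeError("generator template not exhausted")

def template(genfunc):
    def make(*args):
        return DoTemplate(genfunc, args)
    return make

@template
def opening(filename):
    f = open(filename)
    yield f                    # VAR is bound to the open file
    f.close()

# Expansion of:  do opening("data.txt") as f:  BLOCK
out = open("data.txt", "w")
out.write("hello")
out.close()
abc = opening("data.txt")
f = abc.__enter__()
try:
    data = f.read()
finally:
    abc.__exit__()
print(data)      # hello
print(f.closed)  # True
```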

Straight up-or-down votes in the full senate are appreciated at this point.

On to the secondary questions:

- Today I like the 'do' keyword better; 'with' might confuse folks
coming from Pascal or VB

- I have a more elaborate proposal for __exit__'s arguments. Let the
translation be as follows:

abc = EXPR
[VAR =] abc.__enter__()
oke = False  # Pronounced "okay"
exc = ()
try:
    try:
        BLOCK
        oke = True
    except:
        exc = sys.exc_info()
        raise
finally:
    abc.__exit__(oke, *exc)

This means that __exit__ can be called with the following arguments:

abc.__exit__(True) - normal completion of BLOCK

abc.__exit__(False) - BLOCK was left by a non-local goto (break/continue/return)

abc.__exit__(False, t, v, tb) - BLOCK was left by an exception

(An alternative would be to always call it with 4 arguments, the last
three being None in the first two cases.)

If we adopt PEP 340 redux, it's up to the decorator for degenerate
generators to decide how to pass this information into the generator;
if we adopt PEP 342 ("continue EXPR") at the same time, we can let the
yield-expression return a 4-tuple (oke, t, v, tb). Most templates can
ignore this information (so they can just use a yield-statement).

PS. I've come up with another interesting use case: block signals for
the duration of a block. This could be a function in the signal
module, e.g. signal.blocking([list of signals to block]). The list
would default to all signals. Similarly, signal.ignoring().
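Neither signal.blocking() nor signal.ignoring() ever became part of the signal module; here is a sketch of what ignoring() could look like today (contextlib postdates this thread, and SIGINT is used because it exists on all platforms):

```python
import signal
from contextlib import contextmanager

@contextmanager
def ignoring(*signals):
    """Temporarily ignore the given signals, restoring handlers on exit."""
    saved = {s: signal.getsignal(s) for s in signals}
    for s in signals:
        signal.signal(s, signal.SIG_IGN)
    try:
        yield
    finally:
        for s, handler in saved.items():
            signal.signal(s, handler)

before = signal.getsignal(signal.SIGINT)
with ignoring(signal.SIGINT):
    print(signal.getsignal(signal.SIGINT) is signal.SIG_IGN)  # True
print(signal.getsignal(signal.SIGINT) is before)              # True
```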

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From mal at egenix.com  Fri May 13 13:08:17 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 13 May 2005 13:08:17 +0200
Subject: [Python-Dev] Python's Unicode width default (New Py_UNICODE doc)
In-Reply-To: <42810104.3090303@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>	<427C029D.3090907@v.loewis.de>	<427D0E80.4080502@egenix.com>	<427DD4EF.4030109@v.loewis.de>	<427F9007.3070603@egenix.com>	<427FD6D8.2010003@v.loewis.de>
	<428079B5.6010602@egenix.com> <42810104.3090303@v.loewis.de>
Message-ID: <42848AA1.1020402@egenix.com>

Martin v. L?wis wrote:
> M.-A. Lemburg wrote:
> 
>>Martin, please reconsider... the choice is between:
> 
> 
> The point is that this all was discussed, and decided the
> other way 'round. There is no point in going back and forth
> between the two choices:
> 
> http://mail.python.org/pipermail/python-dev/2003-June/036461.html

So you call two emails to the python-dev list a discussion?

AFAICT, only Barry mildly suggested having an automatic
--enable-unicode=ucs4 switch, and then Jeff Epler provided
the patch, including the warning that the patch wasn't tested
and that it makes no attempt at a more educated
guess as to where to find tcl.h (unlike what setup.py does
in order to build _tkinter.c).

> If we remove the code, people will *again* report that
> _tkinter stops building on Redhat (see #719880). I
> see no value in breaking what works now.

I'm not breaking anything, I'm just correcting the
way things have to be configured in an effort to
bring back the cross-platform configure default.

>>a) We have a cross-platform default Unicode width
>>   setting of UCS2.
> 
> 
> It is hardly the cross-platform default anymore. Many
> installations on Linux are built as UCS-4 now - no
> matter what configure does.

I'm talking about the *configure* default, not the
default installation you find on any particular
platform (this remains a platform decision to be made
by the packagers).

>>b) The default Unicode width is undefined and the only
>>   thing we can tell the user is:
>>
>>   Run the configure script and then try the interpreter
>>   to check whether you've got a UCS2 or UCS4 build.
> 
> 
> It's not at all undefined. There is a precise, deterministic,
> repeatable algorithm that determines the default, and
> if people want to know, we can tell them.

The outcome of the configure tests is bound to be
highly random across installations, since it depends on
whether TCL was installed on the system and how it
was configured. Furthermore, if a user wants to build
against a different TCL version, configure won't detect
this change, since it's setup.py that does the _tkinter.c
compilation.

The main point is that we can no longer tell users:
if you run configure without any further options,
you will get a UCS2 build of Python.

I want to restore this fact which was true before
Jeff's patch was applied.

Telling users to look at the configure script printout
to determine whether they have just built a UCS2
or a UCS4 Python is just not right, given the implications.
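At least the result of such a build can be checked at runtime rather than by rereading the configure output; on builds of that era:

```python
import sys

# sys.maxunicode exposes the compile-time Unicode width:
# 0xFFFF on a UCS2 (narrow) build, 0x10FFFF on a UCS4 (wide) build.
# (Since PEP 393 in Python 3.3 it is always 0x10FFFF.)
width = "UCS2" if sys.maxunicode == 0xFFFF else "UCS4"
print(width)
```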

>>I want to change the --enable-unicode switch back to
>>always use UCS2 as default and add a new option value
>>"tcl" which then triggers the behavior you've added to
>>support _tkinter, ie.
>>
>>    --enable-unicode=tcl
>>
>>bases the decision to use UCS2 or UCS4 on the installed
>>TCL interpreter (if there is one).
> 
> Please don't - unless you also go back and re-open the
> bug reports, change the documentation, tell the Linux
> packagers that settings have changed, and so on.
> 
> Why deliberately break what currently works?

It will continue to work - the only change, if any,
is to add --enable-unicode=tcl or --enable-unicode=ucs4
(if you know that TCL uses UCS4) to your configure
setup. The --enable-unicode=ucs4 configure setting
is part of RedHat and SuSE already, so there won't
be any changes necessary.

BTW, SuSE builds TCL using UCS2 which seems to be
the correct choice given this comment in tcl.h:
"""
 * At this time UCS-2 mode is the default and recommended mode.
 * UCS-4 is experimental and not recommended.  It works for the core,
 * but most extensions expect UCS-2.
"""
and _tkinter.c built for a UCS4 Python does work with
a UCS2 TCL.

About the documentation: this still refers to the UCS2
default build and will need to be updated to also
mention UCS4 anyway.

About the bug reports: feel free to assign them to me.
We can have a canned response if necessary, but I
doubt that it will be necessary.

Explicit is better than implicit :-)

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 13 2005)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From pje at telecommunity.com  Fri May 13 16:08:23 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Fri, 13 May 2005 10:08:23 -0400
Subject: [Python-Dev] "with" use case: replacing a file
In-Reply-To: <42847228.6070009@gmail.com>
References: <ca471dc205051215326d1bda4c@mail.gmail.com>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc205051207504facf994@mail.gmail.com>
	<5.1.1.6.0.20050512162602.01f7c420@mail.telecommunity.com>
	<4283D4B0.10306@gmail.com>
	<ca471dc205051215326d1bda4c@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050513100338.02478a90@mail.telecommunity.com>

At 07:23 PM 5/13/2005 +1000, Nick Coghlan wrote:
>Guido van Rossum wrote:
> >>P.S. The points regarding non-local flow control in Joel Spolsky's 
> latest Joel
> >>on Software article (especially the links at the end) may have had 
> something to
> >>do with my change of heart. . .
> >
> >
> > I'm a big fan of Joel. Care to share the specific URL for the article
> > you're referring to?
> >
>
>Sorry about that (I was in a hurry this morning). It was here:
>http://www.joelonsoftware.com/articles/Wrong.html

"Wrong" is an excellent title for that article; it's completely wrongheaded 
about exceptions.  :)  Great basic idea (make wrong code look wrong), but 
for all practical purposes the actual specific advice in the article is 
only meaningful for C, where you can't create real types and there are no 
exceptions.  In Python, there are *much* saner solutions to his strawman 
problems.


From mwh at python.net  Fri May 13 15:12:34 2005
From: mwh at python.net (Michael Hudson)
Date: Fri, 13 May 2005 14:12:34 +0100
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505130305233b330e@mail.gmail.com> (Guido van Rossum's
	message of "Fri, 13 May 2005 03:05:20 -0700")
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
Message-ID: <2mpsvv5kxp.fsf@starship.python.net>

Guido van Rossum <gvanrossum at gmail.com> writes:

> I just read Raymond Chen's rant against control flow macros:
> http://blogs.msdn.com/oldnewthing/archive/2005/01/06/347666.aspx
>
> I think this pretty much kills PEP 340, as well as Nick Coghlan's
> alternative: both proposals let you write a "template" that can be
> used to hide exception-catching code, which is a form of control flow
> (and a pretty important one if you read Chen's rant against exceptions
> referenced by the former, even if you don't agree with everything he
> says in the latter).

Well, I'm not sure what the content of the latter article is, other
than "getting things right can be hard".

BTW, the "else:" on try statements is so very handy for getting this
sort of thing (more) correct.
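A quick illustration of that else: clause: it runs only when the try suite finished without raising, so the except clause cannot accidentally swallow errors from the follow-up code.

```python
def double_of(text):
    try:
        value = int(text)
    except ValueError:
        return None
    else:
        # Runs only if int() succeeded; a ValueError raised here
        # would NOT be caught by the except clause above.
        return value * 2

print(double_of("21"))    # 42
print(double_of("oops"))  # None
```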

> Which leaves us, IMO, with the choice between PEP 310 and my own
> "PEP-340-redux" proposal; these *only* introduce a finally-clause,
> which does not affect the control flow. I'm not counting exceptions
> that might happen in the finally-clause; exceptions can happen
> anywhere anyway. But I am counting the *catching* of an exception as
> control flow, since that means that code past BLOCK (in the same
> function) is reachable even if BLOCK was not executed to completion;
> and this is the argument against PEP 340 and against Nick's
> alternative.
>
> Let's compare and contrast the two remaining competitors:
>
> PEP 310
> =======
>
> Syntax:
> with EXPR [= VAR]:
>     BLOCK
>
> Translation:
> [VAR =] abc = EXPR
> if hasattr(abc, "__enter__"):
>     abc.__enter__()
> try:
>     BLOCK
> finally:
>     abc.__exit__()
>
> Pros:
> - dead simple
>
> Cons:
> - can't use a decorated generator for EXPR

Sorry, why not?  [note: I work this out, below, but I still think the
code is worth posting]

import sys

class BlockTemplate(object):
      def __init__(self, g, args, kw):
          self.g = g
          self.args = args
          self.kw = kw
      def __enter__(self):
          self.giter = self.g(*self.args, **self.kw)
          self.giter.next()
      def __exit__(self):
          try:
              self.giter.next()
          except StopIteration:
              pass
          else:
              raise RuntimeError, "generator not exhausted"

def template(g):
    def _(*args, **kw):
        return BlockTemplate(g, args, kw)
    return _

@template
def redirected_stdout(out):
    print 'hi'
    save_stdout = sys.stdout
    sys.stdout = out
    yield None
    sys.stdout = save_stdout
    print 'ho'


## with redirected_stdout(fileobj):
##     print 1

output = open("foo", "w")

abc = redirected_stdout(output)
abc.__enter__()
try:
    print 1
finally:
    abc.__exit__()

output.close()

print repr(open("foo").read())

(this was a bit harder to get right than I expected, mind).

Oh, I guess the point is that with a decorated generator you can yield
a value to be used as VAR, rather than just discarding the value as
here.  Hmm.

> PEP 340 redux
> =============
>
> Syntax:
> do EXPR [as VAR]:
>     BLOCK
>
> Translation:
> abc = EXPR
> [VAR =] abc.__enter__()
> try:
>     BLOCK
> finally:
>     abc.__exit__(*"sys.exc_info()") # Not exactly

These two expansions look very similar to me.  What am I missing?

> Pros:
> - can use a decorated generator as EXPR
> - separation of EXPR and VAR (VAR gets what EXPR.__enter__() returns)

Oh!  Hmm.  This is a bit subtle.

I guess I should think about some examples.

> Cons:
> - slightly less simple (__enter__ must return something for VAR;
>   __exit__ takes optional args)

If things were fiddled such that sys.exc_info() returned non-Nones when
a finally clause is being executed because of an exception, we don't
really need this wart, do we?

> Everything else is equal or can be made equal. We can make them more
> equal by treating the arguments passed to __exit__() as a separate
> decision, and waffling about whether __enter__() should be optional (I
> think it's a bad idea even for PEP 310; it *could* be made optional
> for PEP 340 redux).

I don't really recall why it's optional in PEP 310.

> Let's also not quibble about the keyword used; again, that can be a
> separate decision. Note that only PEP 310 can use the "VAR = EXPR"
> syntax; PEP 340 redux *must* use "EXPR as VAR" since it doesn't assign
> the value of EXPR to VAR; PEP 310 can be rewritten using this syntax
> as well.
>
> So then the all-important question I want to pose is: do we like the
> idea of using a (degenerate, decorated) generator as a "template" for
> the do-statement enough to accept the slightly increased complexity?

Looking at my above code, no (even though I think I've rendered the
point moot...).  Compare and contrast:

@template
def redirected_stdout(out):
    save_stdout = sys.stdout
    sys.stdout = out

    yield None

    sys.stdout = save_stdout

class redirected_stdout(object):

    def __init__(self, output):
        self.output = output

    def __enter__(self):
        self.save_stdout = sys.stdout
        sys.stdout = self.output

    def __exit__(self):
        sys.stdout = self.save_stdout

The former is shorter and contains fewer (well, no) 'self.'s, but I
think I find the latter somewhat clearer.

> The added complexity is caused by the need to separate VAR from EXPR
> so that a generator can be used. I personally like this separation; I
> actually like that the "anonymous block controller" is logically
> separate from the variable bound by the construct.

Nevertheless, I think I actually like this argument!

> From Greg Ewing's response to the proposal to endow file objects
> with __enter__ and __exit__ methods, I believe he thinks so too.
>
> Straight up-or-down votes in the full senate are appreciated at this point.

+1 for the PEP 340 variant.

> On to the secondary questions:
>
> - Today I like the 'do' keyword better; 'with' might confuse folks
> coming from Pascal or VB

No opinion.

> - I have a more elaborate proposal for __exit__'s arguments. Let the
> translation be as follows:
>
> abc = EXPR
> [VAR =] abc.__enter__()
> oke = False  # Pronounced "okay"
> exc = ()
> try:
>     try:
>         BLOCK
>         oke = True
>     except:
>         exc = sys.exc_info()
>         raise
> finally:
>     abc.__exit__(oke, *exc)

-"a bit"

> PS. I've come up with another interesting use case: block signals for
> the duration of a block. This could be a function in the signal
> module, e.g. signal.blocking([list of signals to block]). The list
> would default to all signals. Similarly, signal.ignoring().

First you need to hit the authors of various libcs with big sticks.

Cheers,
mwh

-- 
  <shapr> ucking keyoar
                                                -- from Twisted.Quotes

From pje at telecommunity.com  Fri May 13 16:50:02 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Fri, 13 May 2005 10:50:02 -0400
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505130305233b330e@mail.gmail.com>
References: <428437F2.9020409@canterbury.ac.nz>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com>
	<4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
Message-ID: <5.1.1.6.0.20050513101523.03d908a8@mail.telecommunity.com>

At 03:05 AM 5/13/2005 -0700, Guido van Rossum wrote:
>So then the all-important question I want to pose is: do we like the
>idea of using a (degenerate, decorated) generator as a "template" for
>the do-statement enough to accept the slightly increased complexity?

Since the "do protocol" is now distinct from the iterator protocol, I don't 
believe a decorator is still required.  The purpose of the decorator was to 
help reduce confusion between the block statement and a "for" loop.  Since 
you can no longer swallow exceptions, there is no downside to using an 
existing generator as the target of a "do" statement.  That is, the concern 
about generators catching StopIteration from a "yield" doesn't matter, as 
they will simply step through to their next yield statement, and then the 
original exception will propagate.


>The added complexity is caused by the need to separate VAR from EXPR
>so that a generator can be used. I personally like this separation; I
>actually like that the "anonymous block controller" is logically
>separate from the variable bound by the construct. From Greg Ewing's
>response to the proposal to endow file objects with __enter__ and
>__exit__ methods, I believe he thinks so too.
>
>Straight up-or-down votes in the full senate are appreciated at this point.

+1.


>On to the secondary questions:
>
>- Today I like the 'do' keyword better; 'with' might confuse folks
>coming from Pascal or VB

+1 on "do EXPR as VAR", where VAR may be any valid LHS of an assignment.


>This means that __exit__ can be called with the following arguments:
>
>abc.__exit__(True) - normal completion of BLOCK
>
>abc.__exit__(False) - BLOCK was left by a non-local goto 
>(break/continue/return)
>
>abc.__exit__(False, t, v, tb) - BLOCK was left by an exception
>
>(An alternative would be to always call it with 4 arguments, the last
>three being None in the first two cases.)

I'm not sure the extra argument is a good idea; doesn't this introduce the 
same sort of invisible control flow as swallowing exceptions?  Also, since 
the block controller can't actually change the control flow, I'm having a 
hard time thinking of any actual use cases for this information.


>If we adopt PEP 340 redux, it's up to the decorator for degenerate
>generators to decide how to pass this information into the generator;
>if we adopt PEP 342 ("continue EXPR") at the same time, we can let the
>yield-expression return a 4-tuple (oke, t, v, tb). Most templates can
>ignore this information (so they can just use a yield-statement).

I was going to propose having a generator-iterator's __exit__() raise the 
triple inside the generator, or raise StopIteration inside the generator if 
there is no triple.  I'd ideally also like close() as a synonym for 
__exit__() with no arguments.  Although these are properly the subject of 
PEPs 288 and 325 respectively, I felt this would elegantly bring them both 
to closure.

However, after thinking it through, I realized that I don't see any obvious 
way to make __exit__ reusable for PEPs 288 and 325, because for 288 at 
least, I'd want __exit__ to either return the next yielded value or raise 
StopIteration.  But, this isn't compatible with the "do protocol"'s needs, 
unless the "do protocol" suppressed StopIteration, and that doesn't seem 
like such a good idea.

It seems to me that passing exceptions into a generator does in fact 
require a distinct method.  But, I still do believe that 
generator-iterators can have predefined __enter__ and __exit__ methods 
without the need for a decorator.


>PS. I've come up with another interesting use case: block signals for
>the duration of a block. This could be a function in the signal
>module, e.g. signal.blocking([list of signals to block]). The list
>would default to all signals. Similarly, signal.ignoring().

Sweet.  You could also use it for temporary signal handling, i.e. "set this 
signal handler for the duration of the block".  Sort of the same class of 
"make sure I restore a global I'm tampering with" use case as redirecting 
stdout, and Tim Peters' Decimal context use cases.
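The Decimal use case did eventually land in the stdlib as decimal.localcontext(), which saves and restores the thread's arithmetic context:

```python
from decimal import Decimal, localcontext, getcontext

default_prec = getcontext().prec

# Temporarily tamper with the global (per-thread) context...
with localcontext() as ctx:
    ctx.prec = 3
    inside = +Decimal("1.23456")  # unary + rounds to the active precision

# ...and it is restored on exit, even if the block had raised.
print(inside)                             # 1.23
print(getcontext().prec == default_prec)  # True
```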


From gvanrossum at gmail.com  Fri May 13 17:41:50 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 08:41:50 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <2mpsvv5kxp.fsf@starship.python.net>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
	<2mpsvv5kxp.fsf@starship.python.net>
Message-ID: <ca471dc205051308417c541292@mail.gmail.com>

[Michael Hudson, after much thinking aloud]
> Oh, I guess the point is that with a decorated generator you can yield
> a value to be used as VAR, rather than just discarding the value as
> here.  Hmm.

Right. (I thought it was worth quoting this for the benefit of other
who went down the same trail but didn't quite make it to this
destination.)

> If things were fiddled such that sys.exc_info() returned non-Nones when
> a finally clause is being executed because of an exception, we don't
> really need this wart, do we?

The problem is that sys.exc_info() almost always returns *something*
-- it's usually the last exception that was *ever* caught, except in
certain circumstances.

Phillip wrote on the same issue:
> I'm not sure the extra argument is a good idea; doesn't this introduce the
> same sort of invisible control flow as swallowing exceptions?  Also, since
> the block controller can't actually change the control flow, I'm having a
> hard time thinking of any actual use cases for this information.

The 'oke' argument is so that the author of transactional() can decide
what to do with a non-local goto: commit, rollback or hit the author
over the head with a big stick.
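A hypothetical transactional() under that convention (FakeDB stands in for a real database handle; the __exit__ signature follows the proposal upthread):

```python
class transactional:
    def __init__(self, db):
        self.db = db
    def __enter__(self):
        self.db.begin()
        return self.db
    def __exit__(self, oke, *exc):
        if oke:
            self.db.commit()    # BLOCK completed normally
        else:
            self.db.rollback()  # exception or non-local goto

class FakeDB:
    def __init__(self):
        self.log = []
    def begin(self):
        self.log.append("begin")
    def commit(self):
        self.log.append("commit")
    def rollback(self):
        self.log.append("rollback")

# Expansion of:  do transactional(db):  BLOCK
db = FakeDB()
abc = transactional(db)
abc.__enter__()
oke = False
try:
    pass                        # BLOCK (completes normally)
    oke = True
finally:
    abc.__exit__(oke)
print(db.log)                   # ['begin', 'commit']

db2 = FakeDB()
abc2 = transactional(db2)
abc2.__enter__()
abc2.__exit__(False)            # BLOCK left early (break/continue/return)
print(db2.log)                  # ['begin', 'rollback']
```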

[Michael again]
> Compare and contrast:
> 
> @template
> def redirected_stdout(out):
>     save_stdout = sys.stdout
>     sys.stdout = out
> 
>     yield None
> 
>     sys.stdout = save_stdout
> 
> class redirected_stdout(object):
> 
>     def __init__(self, output):
>         self.output = output
> 
>     def __enter__(self):
>         self.save_stdout = sys.stdout
>         sys.stdout = self.output
> 
>     def __exit__(self):
>         sys.stdout = self.save_stdout
> 
> The former is shorter and contains fewer (well, no) 'self.'s, but I
> think I find the latter somewhat clearer.

Tastes differ. I think the generator wins; more so when there's more
state to remember.

[Michael quoting Guido]
> > The added complexity is caused by the need to separate VAR from EXPR
> > so that a generator can be used. I personally like this separation; I
> > actually like that the "anonymous block controller" is logically
> > separate from the variable bound by the construct.
> 
> Nevertheless, I think I actually like this argument!

(Repeated for the benefit of others.)

> > Straight up-or-down votes in the full senate are appreciated at this point.
> 
> +1 for the PEP 340 variant.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From p.f.moore at gmail.com  Fri May 13 17:53:03 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 13 May 2005 16:53:03 +0100
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc205051308417c541292@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
	<2mpsvv5kxp.fsf@starship.python.net>
	<ca471dc205051308417c541292@mail.gmail.com>
Message-ID: <79990c6b0505130853fcee4c6@mail.gmail.com>

On 5/13/05, Guido van Rossum <gvanrossum at gmail.com> wrote:
> Tastes differ. I think the generator wins; more so when there's more
> state to remember.
[...]
> > > Straight up-or-down votes in the full senate are appreciated at this point.
> >
> > +1 for the PEP 340 variant.

I am also +1 for the PEP 340 variant. I can see the value in
generators when state management starts to become more complex.

No significant opinion on choice of keyword.

I don't follow the subtleties for the more elaborate __exit__, so I'll
pass on that one as well.

Paul.

From pje at telecommunity.com  Fri May 13 18:07:55 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Fri, 13 May 2005 12:07:55 -0400
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc205051308417c541292@mail.gmail.com>
References: <2mpsvv5kxp.fsf@starship.python.net>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com>
	<4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
	<2mpsvv5kxp.fsf@starship.python.net>
Message-ID: <5.1.1.6.0.20050513115715.021d29e0@mail.telecommunity.com>

At 08:41 AM 5/13/2005 -0700, Guido van Rossum wrote:
>The 'oke' argument is so that the author of transactional() can decide
>what to do with a non-local goto: commit, rollback or hit the author
>over the head with a big stick.

Since this is just a replacement for a try/except/finally block, I'd expect 
that in a transactional case a non-local goto would work the same as 
any other non-exception exit.

ISTM that the resource block use cases are:

* Save the current state of something, modify it, and then restore the old 
state once the block completes (try/finally, used for locking, redirection, 
signals, decimal context, etc.)

* Automatically roll back partially-done work in case of exception, and/or 
"roll forward" completed work (try/except/else, used for "transaction" 
scenarios)

* Release allocated resource(s) after use (try/finally, used to close files 
and suchlike)

None of these, AFAICT, benefit from differing behavior in the presence of 
nonlinear (but non-exceptional) control flow.  It just seems too magical to 
me in the new context.  When we were talking about a "block" construct or 
user-defined syntax, it made more sense because you could actually redefine 
the *meaning* of those constructs to some extent -- and because the 
*target* of the break and continue statements at least was the block 
itself, not some containing block.
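The three patterns Phillip lists are all plain try/finally or try/except/else 
shapes once spelled out; a minimal sketch (the helper names and the `db` object 
with commit()/rollback() methods are hypothetical, modern Python shown):

```python
import sys

def with_redirected_stdout(out, body):
    # Pattern 1: save state, modify it, restore it when the block completes.
    save_stdout = sys.stdout
    sys.stdout = out
    try:
        body()
    finally:
        sys.stdout = save_stdout  # restored even if body() raises

def with_transaction(db, body):
    # Pattern 2: roll back on exception, "roll forward" completed work.
    try:
        body()
    except Exception:
        db.rollback()
        raise
    else:
        db.commit()

def with_open_file(name, body):
    # Pattern 3: release the allocated resource after use, no matter what.
    f = open(name)
    try:
        body(f)
    finally:
        f.close()
```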


From tim.peters at gmail.com  Fri May 13 18:10:32 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Fri, 13 May 2005 12:10:32 -0400
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <ca471dc20505130206a7e8b7a@mail.gmail.com>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<428437DC.7030304@canterbury.ac.nz>
	<ca471dc20505130206a7e8b7a@mail.gmail.com>
Message-ID: <1f7befae050513091052eeff6c@mail.gmail.com>

[Guido, on string exceptions]
> ...
> Last I looked Zope 2 still depended on them (especially in the
> bowels of ZODB); maybe Tim Peters knows if that's still the
> case.

Certainly none of that in ZODB, or in ZRS.  Definitely some in Zope 2.6:

<http://mail.zope.org/pipermail/zope-tests/2005-May/002110.html>

I don't think there are any string exceptions in Zope 2.7, Zope 2.8,
or Zope 3.  Development on Zope 2.6 stopped about a year ago, so the
2.6 story will never change; by the same token, no version of Python
after 2.3.5 will ever be approved for use with 2.6 anyway.

From steven.bethard at gmail.com  Fri May 13 18:22:54 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Fri, 13 May 2005 10:22:54 -0600
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505130305233b330e@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
Message-ID: <d11dcfba050513092247bf4f8b@mail.gmail.com>

On 5/13/05, Guido van Rossum <gvanrossum at gmail.com> wrote:
> So then the all-important question I want to pose is: do we like the
> idea of using a (degenerate, decorated) generator as a "template" for
> the do-statement enough to accept the slightly increased complexity?

+0.  I'm not thoroughly convinced that generators are that much easier
to read than a class.  But I don't find them hard to read, and I think
it would only take a little effort to learn that generators might not
always be intended to build iterators.

If we do support generators in do-statements, I'd like their
__enter__() and __exit__() methods (if possible) to have semantics
like Nick Coghlan suggested[1], so that:
 * __enter__() raises an exception if next() has already been called, and
 * __exit__() raises an exception if StopIteration is not raised
The first makes sure that the generator is only used once, and the
second makes sure that there is only one yield on the given control
path through the generator.  In all but the most sick and twisted
code, raising exceptions like this will identify errors in how
the generator was written.
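A sketch of a wrapper with those two checks (names are hypothetical; the real 
decorator Guido posted differs in detail): __enter__() refuses reuse, and 
__exit__() insists the generator is exhausted after its single yield.

```python
class _DoTemplate:
    def __init__(self, gen):
        self.gen = gen
        self.entered = False

    def __enter__(self):
        if self.entered:
            raise RuntimeError("do-template used more than once")
        self.entered = True
        return next(self.gen)  # run up to the (single) yield

    def __exit__(self):
        try:
            next(self.gen)
        except StopIteration:
            return  # exactly one yield on this control path: the expected case
        raise RuntimeError("do-template generator yielded more than once")

def do_template(genfunc):
    # Decorator: calling the generator function builds a one-shot template.
    def make_template(*args, **kwds):
        return _DoTemplate(genfunc(*args, **kwds))
    return make_template
```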

> Straight up-or-down votes in the full senate are appreciated at this point.

+1 on the PEP 340 redux semantics.

> On to the secondary questions:
> 
> - Today I like the 'do' keyword better; 'with' might confuse folks
> coming from Pascal or VB

+1 on using 'do'.

> - I have a more elaborate proposal for __exit__'s arguments. Let the
> translation be as follows:
[snip]
> abc.__exit__(True) - normal completion of BLOCK
> 
> abc.__exit__(False) - BLOCK was left by a non-local goto (break/continue/return)
> 
> abc.__exit__(False, t, v, tb) - BLOCK was left by an exception

-1. This looks like a fair bit of added complexity for not much gain.
The only example that even *might* make use of this was the
transactional one, and I haven't yet seen a use case where it actually
*does*.

The simpler semantics give you the difference between a normal exit
and an exceptional exit.  I'd like to see an example that needs to
know the difference between block completion exit and a
break/continue/return exit before I'd want to make PEP 340 redux this
much more complex.

STeVe

[1] http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From gvanrossum at gmail.com  Fri May 13 18:23:45 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 09:23:45 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <5.1.1.6.0.20050513115715.021d29e0@mail.telecommunity.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
	<2mpsvv5kxp.fsf@starship.python.net>
	<ca471dc205051308417c541292@mail.gmail.com>
	<5.1.1.6.0.20050513115715.021d29e0@mail.telecommunity.com>
Message-ID: <ca471dc20505130923149f5d95@mail.gmail.com>

[Guido]
> >The 'oke' argument is so that the author of transactional() can decide
> >what to do with a non-local goto: commit, rollback or hit the author
> >over the head with a big stick.

[Phillip J. Eby]
> Since this is just a replacement for a try/except/finally block, I'd expect
> that in a transactional case a non-local goto would work the same as
> any other non-exception exit.
> 
> ISTM that the resource block use cases are:
> 
> * Save the current state of something, modify it, and then restore the old
> state once the block completes (try/finally, used for locking, redirection,
> signals, decimal context, etc.)
> 
> * Automatically roll back partially-done work in case of exception, and/or
> "roll forward" completed work (try/except/else, used for "transaction"
> scenarios)
> 
> * Release allocated resource(s) after use (try/finally, used to close files
> and suchlike)
> 
> None of these, AFAICT, benefit from differing behavior in the presence of
> nonlinear (but non-exceptional) control flow.  It just seems too magical to
> me in the new context.  When we were talking about a "block" construct or
> user-defined syntax, it made more sense because you could actually redefine
> the *meaning* of those constructs to some extent -- and because the
> *target* of the break and continue statements at least was the block
> itself, not some containing block.

That works for me; I was just hypothesizing about the needs of others,
but personally I'm fine with not knowing. I guess part of my
motivation is also that this information is readily available
intenally when a finally-clause is executed, since when the clause
completes a pending non-local goto has to be resumed. But there's no
reason to expose *all* internal state information...

So the signature of __exit__ is just what sys.exc_info() returns.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Fri May 13 18:29:19 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 09:29:19 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <d11dcfba050513092247bf4f8b@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
	<d11dcfba050513092247bf4f8b@mail.gmail.com>
Message-ID: <ca471dc205051309291d66f89c@mail.gmail.com>

[Steven Bethard]
> +0.  I'm not thoroughly convinced that generators are that much easier
> to read than a class.  But I don't find them hard to read, and I think
> it would only take a little effort to learn that generators might not
> always be intended to build iterators.

I am proposing (like Phillip Eby in his response to PEP 340) to use a
special decorator that turns a generator into a "do-template", so the
intention is evident from the generator declaration.

> If we do support generators in do-statements, I'd like their
> __enter__() and __exit__() methods (if possible) to have semantics
> like Nick Coghlan suggested[1], so that:
>  * __enter__() raises an exception if next() has already been called, and
>  * __exit__() raises an exception if StopIteration is not raised

I guess you missed my post where I gave the code for the decorator; it
does exactly that.

> The simpler semantics give you the difference between a normal exit
> and an exceptional exit.  I'd like to see an example that needs to
> know the difference between block completion exit and a
> break/continue/return exit before I'd want to make PEP 340 redux this
> much more complex.

I agreed to that in my response to Phillip Eby. I do want to pass the
exception into __exit__ so that it can be logged, for example.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From ncoghlan at gmail.com  Fri May 13 18:35:47 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 14 May 2005 02:35:47 +1000
Subject: [Python-Dev] "with" use case: replacing a file
In-Reply-To: <5.1.1.6.0.20050513100338.02478a90@mail.telecommunity.com>
References: <ca471dc205051215326d1bda4c@mail.gmail.com>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<42833090.3060905@gmail.com>
	<d11dcfba050512065272ce7a14@mail.gmail.com>
	<ca471dc205051207504facf994@mail.gmail.com>
	<5.1.1.6.0.20050512162602.01f7c420@mail.telecommunity.com>
	<4283D4B0.10306@gmail.com>
	<ca471dc205051215326d1bda4c@mail.gmail.com>
	<5.1.1.6.0.20050513100338.02478a90@mail.telecommunity.com>
Message-ID: <4284D763.8070700@gmail.com>

Phillip J. Eby wrote:
> At 07:23 PM 5/13/2005 +1000, Nick Coghlan wrote:
>> Sorry about that (I was in a hurry this morning). It was here:
>> http://www.joelonsoftware.com/articles/Wrong.html
> 
> "Wrong" is an excellent title for that article; it's completely 
> wrongheaded about exceptions.  :)  Great basic idea (make wrong code 
> look wrong), but for all practical purposes the actual specific advice 
> in the article is only meaningful for C, where you can't create real 
> types and there are no exceptions.  In Python, there are *much* saner 
> solutions to his strawman problems.

I have to agree. However, having had to deal with C++'s excuse for exceptions 
(and the effects of C programmers being let loose on them), I can understand 
where he is coming from.

And the basic idea, that flow control you can't see is a potential problem, is 
sound. For exceptions in general, the benefits are worth the costs, but I don't 
think the same can be said for allowing statement templates the power to 
suppress them.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From jimjjewett at gmail.com  Thu May 12 16:43:45 2005
From: jimjjewett at gmail.com (Jim Jewett)
Date: Thu, 12 May 2005 10:43:45 -0400
Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally
Message-ID: <fb6fbf560505120743240e7646@mail.gmail.com>

[Guido]
>>> Can I go ahead and approve this now?

[Michael Hudson]
>> While I see the cost of this PEP being pretty small, I see the benefit
>> the same way too.

[Guido]
> Sure. Let me approve it and we'll see if someone cares enough to implement it.

No one will scream if you approve it, but when you asked permission
it seemed somehow serious and permanent.  By itself, the change is 
fine -- but there is still a nagging worry that it might interact badly with 
PEP-340 (or 3XX) Resource Management blocks.

(1)  If people could write a single try statement instead of nesting them,
would they be less likely to factor the locking out into a separate statement?

(2)  If 340 ends up as a simple version that doesn't handle fancy except:
processing, would this change make PEP 340 look crippled in comparison?
Would people avoid Resource Managers as a matter of style?

-jJ

From ncoghlan at gmail.com  Fri May 13 18:43:08 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 14 May 2005 02:43:08 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505130305233b330e@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>	<4281E14C.1070405@gmail.com>	<79990c6b05051105361e7a9ba0@mail.gmail.com>	<42827154.4040006@gmail.com>	<d11dcfba05051114441404ef0a@mail.gmail.com>	<ca471dc20505112028b263df6@mail.gmail.com>
	<4282D512.7060602@zope.com>	<4edc17eb05051208341b6bf751@mail.gmail.com>	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
Message-ID: <4284D91C.30702@gmail.com>

Guido van Rossum wrote:
> I just read Raymond Chen's rant against control flow macros:
> http://blogs.msdn.com/oldnewthing/archive/2005/01/06/347666.aspx
> 
> I think this pretty much kills PEP 340, as well as Nick Coghlan's
> alternative: both proposals let you write a "template" that can be
> used to hide exception-catching code, which is a form of control flow
> (and a pretty important one if you read Chen's rant against exceptions
> referenced by the former, even if you don't agree with everything he
> says in the latter).

It seems the effect of Raymond's articles on you was similar to their effect on 
me :)

> Straight up-or-down votes in the full senate are appreciated at this point.

PEP 340 redux for me (as you might have guessed) - I think the transaction() use 
case is a genuinely useful one. The ability to access the return value of 
__enter__() is also more useful than simply duplicating what could be achieved 
by an assignment on the line before the user defined statement.

> On to the secondary questions:
> 
> - Today I like the 'do' keyword better; 'with' might confuse folks
> coming from Pascal or VB

I think 'do' can be made to read correctly in more contexts than 'with'. The 
lack of a corresponding 'while' or 'until' should eliminate any temptation to 
see it as a loop.

The 'with' keyword also means I keep wanting the magic methods to be called 
"__acquire__" and "__release__" (and those would be harder to type. . .)

> - I have a more elaborate proposal for __exit__'s arguments. Let the
> translation be as follows:

I plan to rewrite my proposal based on this suggestion, just to explore the 
ramifications. I think it will turn out quite nicely.

The ban on yielding inside try/finally will need to be extended to yielding 
inside user defined statements until such time as an iterator finalisation 
protocol is chosen, though.

> (An alternative would be to always call it with 4 arguments, the last
> three being None in the first two cases.)

The former is probably tidier. __exit__() method implementations which don't 
care about the exception details can still use "*exc_info" in the argument 
signature, while those that want to use the information can just name the three 
parts without needing to specify the "=None" defaults.
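The two signature styles read like this (sketch; the class names are 
hypothetical, and this assumes the caller always supplies all three values, as 
in the "alternative" Guido mentions):

```python
class IgnoresDetails:
    """An __exit__ that doesn't care why the block ended."""
    def __exit__(self, *exc_info):
        self.cleaned_up = True

class InspectsDetails:
    """An __exit__ that names the three parts; no '=None' defaults
    are needed when the caller always supplies them."""
    def __exit__(self, exc_type, exc_value, exc_tb):
        self.saw_exception = exc_type is not None
```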

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From fredrik at pythonware.com  Fri May 13 18:42:56 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Fri, 13 May 2005 18:42:56 +0200
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
References: <ca471dc2050509215823876c50@mail.gmail.com><5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com><4281E14C.1070405@gmail.com><79990c6b05051105361e7a9ba0@mail.gmail.com><42827154.4040006@gmail.com><d11dcfba05051114441404ef0a@mail.gmail.com><ca471dc20505112028b263df6@mail.gmail.com>
	<4282D512.7060602@zope.com><4edc17eb05051208341b6bf751@mail.gmail.com><428437F2.9020409@canterbury.ac.nz><ca471dc20505130305233b330e@mail.gmail.com>
	<2mpsvv5kxp.fsf@starship.python.net>
Message-ID: <d62kvo$qvv$1@sea.gmane.org>

Michael Hudson wrote:

> Looking at my above code, no (even though I think I've rendered the
> point moot...).  Compare and contrast:
>
> @template
> def redirected_stdout(out):
>     save_stdout = sys.stdout
>     sys.stdout = out
>
>     yield None
>
>     sys.stdout = save_stdout
>
> class redirected_stdout(object):
>
>     def __init__(self, output):
>         self.output = output
>
>     def __enter__(self):
>         self.save_stdout = sys.stdout
>         sys.stdout = self.output
>
>     def __exit__(self):
>         sys.stdout = self.save_stdout
>
> The former is shorter and contains less (well, no) 'self.'s, but I
> think I find the latter somewhat clearer.

the same argument could be used (and was probably used) against
generators: why not just use __getitem__ and instance state?

as soon as you write something longer than four lines, using more
than one state variable, you'll find that generator-based code is a
lot more readable.
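A template juggling several pieces of state makes the point; here is one 
spelled with the decorator this discussion eventually produced 
(`contextlib.contextmanager`, modern Python shown). In the class version, each 
saved value would need its own `self.` attribute:

```python
import sys
from contextlib import contextmanager

@contextmanager
def redirected_streams(out, err):
    # Two saved values live as plain locals across the yield.
    save_out, save_err = sys.stdout, sys.stderr
    sys.stdout, sys.stderr = out, err
    try:
        yield out, err
    finally:
        sys.stdout, sys.stderr = save_out, save_err
```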

</F>




From gvanrossum at gmail.com  Fri May 13 18:56:53 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 09:56:53 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <4284D91C.30702@gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com> <4284D91C.30702@gmail.com>
Message-ID: <ca471dc20505130956f924603@mail.gmail.com>

[Nick Coghlan]
> The ban on yielding inside try/finally will need to be extended to yielding
> inside user defined statements until such time as an iterator finalisation
> protocol is chosen, though.

Ah! Good point. This breaks PEP 340 example 5. No big deal, but worth noting.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From fredrik at pythonware.com  Fri May 13 18:47:45 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Fri, 13 May 2005 18:47:45 +0200
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
References: <ca471dc2050509215823876c50@mail.gmail.com><5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com><4281E14C.1070405@gmail.com><79990c6b05051105361e7a9ba0@mail.gmail.com><42827154.4040006@gmail.com><d11dcfba05051114441404ef0a@mail.gmail.com><ca471dc20505112028b263df6@mail.gmail.com>
	<4282D512.7060602@zope.com><4edc17eb05051208341b6bf751@mail.gmail.com><428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
Message-ID: <d62l8n$ru5$1@sea.gmane.org>

Guido van Rossum wrote:

> PEP 340 redux
> =============
>
> Syntax:
> do EXPR [as VAR]:
>     BLOCK
>
> Translation:
> abc = EXPR
> [VAR =] abc.__enter__()
> try:
>     BLOCK
> finally:
>     abc.__exit__(*"sys.exc_info()") # Not exactly
>
> Pros:
> - can use a decorated generator as EXPR
> - separation of EXPR and VAR (VAR gets what EXPR.__enter__() returns)
>
> Cons:
> - slightly less simple (__enter__ must return something for VAR;
>   __exit__ takes optional args)

what happened to the original "yield the target object" solution?  or did
I just dream that?
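(For reference, the quoted translation can be simulated with an ordinary 
function standing in for the statement: `mgr` is the result of EXPR, `block` a 
callable standing in for BLOCK. This is a sketch of the quoted expansion, not 
any actual implementation.)

```python
import sys

def run_do(mgr, block):
    # do EXPR as VAR: BLOCK  --  per the quoted translation
    var = mgr.__enter__()
    try:
        block(var)
    finally:
        # "Not exactly", as the quote notes: during an exceptional exit
        # sys.exc_info() holds the in-flight exception, otherwise all None.
        mgr.__exit__(*sys.exc_info())
```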

</F>




From ncoghlan at gmail.com  Fri May 13 19:08:39 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 14 May 2005 03:08:39 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <5.1.1.6.0.20050513115715.021d29e0@mail.telecommunity.com>
References: <2mpsvv5kxp.fsf@starship.python.net>	<ca471dc2050509215823876c50@mail.gmail.com>	<79990c6b05051105361e7a9ba0@mail.gmail.com>	<42827154.4040006@gmail.com>	<d11dcfba05051114441404ef0a@mail.gmail.com>	<ca471dc20505112028b263df6@mail.gmail.com>	<4282D512.7060602@zope.com>	<4edc17eb05051208341b6bf751@mail.gmail.com>	<428437F2.9020409@canterbury.ac.nz>	<ca471dc20505130305233b330e@mail.gmail.com>	<2mpsvv5kxp.fsf@starship.python.net>
	<5.1.1.6.0.20050513115715.021d29e0@mail.telecommunity.com>
Message-ID: <4284DF17.2030309@gmail.com>

Phillip J. Eby wrote:
> At 08:41 AM 5/13/2005 -0700, Guido van Rossum wrote:
> 
>>The 'oke' argument is so that the author of transactional() can decide
>>what to do with a non-local goto: commit, rollback or hit the author
>>over the head with a big stick.
<snip>
> * Automatically roll back partially-done work in case of exception, and/or 
> "roll forward" completed work (try/except/else, used for "transaction" 
> scenarios)

Doing transactions with try/except/else is not quite correct, since using any of 
the three non-local gotos actually executes neither the commit nor the rollback 
(of course, this is where Guido's stick comment comes into play. . .).

However, I'm fine with declaring that, from the perspective of a statement 
template, 'return', 'break' and 'continue' are all 'non-exceptional exits', and 
so templates like transaction() are expected to treat them as such.

Picking one way and enforcing it by restricting the information seen by 
__exit__() also seems to be a much better option than allowing the possibility of:

   do bobs.transaction():
       break
       # Triggers a rollback!

   do alices.transaction():
       break
       # Triggers a commit!

Going the 'non-exceptional exits' route also saves inventing a pseudo-exception 
to stand in for the 3 non-local goto statements (such a pseudo-exception would 
recreate the above behavioural hole, anyway).

An exceptional exit can be forced if a non-local goto needs to be executed in 
response to a failure:

   class AbortDo(Exception): pass

   do alices.transaction():
       break
       # Triggers a commit (non-exceptional exit)

   try:
       do alices.transaction():
           raise AbortDo
           # Triggers a rollback (exceptional exit)
   except AbortDo:
       break

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From gvanrossum at gmail.com  Fri May 13 19:08:41 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 10:08:41 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <d62l8n$ru5$1@sea.gmane.org>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
	<d62l8n$ru5$1@sea.gmane.org>
Message-ID: <ca471dc205051310084ac597be@mail.gmail.com>

[Guido van Rossum]
> > Cons:
> > - slightly less simple (__enter__ must return something for VAR;
> >   __exit__ takes optional args)

[Fredrik Lundh]
> what happened to the original "yield the target object" solution?  or did
> I just dream that?

Don't worry, that works when you use a generator. It just doesn't work
when you're using a class.

The do-statement proposal is a bit ambiguous: on the one hand it's not
strongly tied to generators, since you can easily write a class with
__enter__ and __exit__ methods; on the other hand its essential
difference from PEP 310 is that you *can* use a generator, given a
suitable decorator.

BTW, we need a name for such a decorated generator that only yields
once. I propose to call it a degenerator.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Fri May 13 20:38:51 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 11:38:51 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <4284DF17.2030309@gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
	<2mpsvv5kxp.fsf@starship.python.net>
	<5.1.1.6.0.20050513115715.021d29e0@mail.telecommunity.com>
	<4284DF17.2030309@gmail.com>
Message-ID: <ca471dc20505131138ec108dc@mail.gmail.com>

[Nick Coghlan]
> However, I'm fine with declaring that, from the perspective of a statement
> template, 'return', 'break' and 'continue' are all 'non-exceptional exits', and
> so templates like transaction() are expected to treat them as such.

Me too. The argument that made me realize this is okay came after
reading Raymond Chen's rant about control-flow macros: when you catch
an exception, you don't know how much of the try-block was executed
successfully, and neither does the author of that block; but when you
"catch" a non-local goto, you must assume that the block's author
knows what they are doing, so at least *they* know exactly which code
was executed (everything up to the break/continue/return) and which
wasn't. So the argument about rolling back indeterminate results
doesn't hold. If they want the transaction to fail, they should raise
an exception.
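Under that rule the transactional template becomes simple: only a genuine 
exception rolls back, and every other exit (including break/continue/return) 
commits. A sketch in class form, using the __exit__(t, v, tb) signature; `db` 
is a hypothetical object with commit()/rollback() methods:

```python
class transactional:
    def __init__(self, db):
        self.db = db

    def __enter__(self):
        return self.db

    def __exit__(self, t, v, tb):
        if t is None:
            self.db.commit()    # normal exit, including non-local gotos
        else:
            self.db.rollback()  # exceptional exit
        return False            # never suppress the exception
```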

I really need to start writing PEP 343 to capture this particular
solution more carefully.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From martin at v.loewis.de  Fri May 13 22:19:18 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Fri, 13 May 2005 22:19:18 +0200
Subject: [Python-Dev] Python's Unicode width default (New Py_UNICODE doc)
In-Reply-To: <42848AA1.1020402@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>	<427C029D.3090907@v.loewis.de>	<427D0E80.4080502@egenix.com>	<427DD4EF.4030109@v.loewis.de>	<427F9007.3070603@egenix.com>	<427FD6D8.2010003@v.loewis.de>
	<428079B5.6010602@egenix.com> <42810104.3090303@v.loewis.de>
	<42848AA1.1020402@egenix.com>
Message-ID: <42850BC6.4010907@v.loewis.de>

M.-A. Lemburg wrote:
> I'm not breaking anything, I'm just correcting the
> way things have to be configured in an effort to
> bring back the cross-platform configure default.

Your proposed change will break the build of Python
on Redhat/Fedora systems.

> I'm talking about the *configure* default, not the
> default installation you find on any particular
> platform (this remains a platform decision to be made
> by the packagers).

Why is it good to have such a default? Why is that
so good that it's better than having Tkinter work
by default?

> The main point is that we can no longer tell users:
> if you run configure without any further options,
> you will get a UCS2 build of Python.

It's not a matter of telling the users "no longer".
"We" currently don't tell them that in any documentation;
if you have been telling users that, you were wrong.

./configure --help says that the default for
--enable-unicode is "yes".

> I want to restore this fact which was true before
> Jeff's patch was applied.

I understand that you want that. I'm opposed.

> Telling users to look at the configure script printout
> to determine whether they have just built a UCS2
> or UCS4 is just not right given its implications.

Right. We should tell them what procedure is used.
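One thing such documentation could point users at is the runtime check, which 
avoids reading configure output entirely: on a narrow (UCS2) build 
`sys.maxunicode` is 0xFFFF, on a wide (UCS4) build it is 0x10FFFF. (A sketch; 
note that from Python 3.3 on, all builds are wide, so the question is moot 
there.)

```python
import sys

def unicode_width():
    # 0xFFFF: narrow build (BMP only); 0x10FFFF: wide build (all of Unicode).
    return "ucs2" if sys.maxunicode == 0xFFFF else "ucs4"
```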

> It will continue to work - the only change, if any,
> is to add --enable-unicode=tcl or --enable-unicode=ucs4
> (if you know that TCL uses UCS4) to your configure
> setup. The --enable-unicode=ucs4 configure setting
> is part of RedHat and SuSE already, so there won't
> be any changes necessary.

Yes, but users of these systems need to adjust.

Regards,
Martin

From bac at OCF.Berkeley.EDU  Fri May 13 22:29:06 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Fri, 13 May 2005 13:29:06 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505130305233b330e@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>	<4281E14C.1070405@gmail.com>	<79990c6b05051105361e7a9ba0@mail.gmail.com>	<42827154.4040006@gmail.com>	<d11dcfba05051114441404ef0a@mail.gmail.com>	<ca471dc20505112028b263df6@mail.gmail.com>
	<4282D512.7060602@zope.com>	<4edc17eb05051208341b6bf751@mail.gmail.com>	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
Message-ID: <42850E12.4080609@ocf.berkeley.edu>

Guido van Rossum wrote:
[SNIP]
> Straight up-or-down votes in the full senate are appreciated at this point.
> 

PEP 340 redux gets my +1; I think using generators will become more obviously
useful to people when, as Fredrik pointed out, your code grows beyond a few
lines.

> On to the secondary questions:
> 
> - Today I like the 'do' keyword better; 'with' might confuse folks
> coming from Pascal or VB
> 

 -0; I can deal with either, but I just like how 'with' reads more.

> - I have a more elaborate proposal for __exit__'s arguments. Let the
> translation be as follows:
> 
> abc = EXPR
> [VAR =] abc.__enter__()
> oke = False  # Pronounced "okay"
> exc = ()
> try:
>     try:
>         BLOCK
>         oke = True
>     except:
>         exc = sys.exc_info()
>         raise
> finally:
>     abc.__exit__(oke, *exc)
> 
> This means that __exit__ can be called with the following arguments:
> 
> abc.__exit__(True) - normal completion of BLOCK
> 
> abc.__exit__(False) - BLOCK was left by a non-local goto (break/continue/return)
> 
> abc.__exit__(False, t, v, tb) - BLOCK was left by an exception
> 
> (An alternative would be to always call it with 4 arguments, the last
> three being None in the first two cases.)
> 
> If we adopt PEP 340 redux, it's up to the decorator for degenerate
> generators to decide how to pass this information into the generator;
> if we adopt PEP 342 ("continue EXPR") at the same time, we can let the
> yield-expression return a 4-tuple (oke, t, v, tb). Most templates can
> ignore this information (so they can just use a yield-statement).
> 

I think a later email discussed just passing in the values from sys.exc_info(),
and I like that more since they will be None if no exception was raised and thus
straightforward to detect without being overly verbose with the oke argument.
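For concreteness, here is a sketch of an __exit__ written against that
convention (the transactional wrapper and its db methods are hypothetical,
not anything from the PEP):

```python
class transactional(object):
    """Sketch: __exit__ takes the sys.exc_info() triple, which is
    (None, None, None) when BLOCK completed normally."""

    def __init__(self, db):
        self.db = db

    def __enter__(self):
        self.db.begin()
        return self.db

    def __exit__(self, exc_type=None, exc_value=None, exc_tb=None):
        if exc_type is None:
            self.db.commit()    # BLOCK finished normally
        else:
            self.db.rollback()  # BLOCK raised; details are in the triple
```

No separate oke flag is needed: testing exc_type for None distinguishes
the two cases.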

-Brett

From bac at OCF.Berkeley.EDU  Fri May 13 22:34:36 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Fri, 13 May 2005 13:34:36 -0700
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <ca471dc2050512174655dfc7d6@mail.gmail.com>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>	
	<ca471dc2050512160977cc60f@mail.gmail.com>	
	<4283E9F6.1020500@ocf.berkeley.edu>
	<ca471dc2050512174655dfc7d6@mail.gmail.com>
Message-ID: <42850F5C.3080304@ocf.berkeley.edu>

Guido van Rossum wrote:
> [Brett C.]
> 
>>Seems like, especially if we require inheritance from a base exception class in
>>Python 3000, exceptions should have standard 'arg' and 'traceback' attributes
>>with a possible 'context' attribute (or always a 'context' attribute set to
>>None if not a chained exception).
>>
>>I don't think there is other data normally associated with exceptions is there?
> 
> 
> I despise the "arg" argument -- I like Java's "message" concept better.
> 

Works for me.

> 
>>I really need to get off my ass one of these days and just write a PEP targeted
>>for Python 3000 with base inheritance, standard attributes (including exception
>>chains), reworking the built-in exception inheritance hierarchy, and whether
>>bare 'except' statements should go or only catch certain exceptions.  Could
>>probably stand to break it up into multiple PEPs, though.  =)
> 
> 
> +1.
> 
> I think these things are sufficiently closely related to keep them all
> in one PEP.
> 

OK, I will see if I can get to the PEP this summer assuming I have time
(waiting on an official offer for an internship that will keep me busy this summer,
but I am hoping to spend my free time Python hacking; got to finish the AST
branch some day  =).  Probably will also write it up in parts as outlined above
so that I can just get parts out quickly without being held up by discussions
about other sections.

-Brett

From bac at OCF.Berkeley.EDU  Fri May 13 22:40:10 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Fri, 13 May 2005 13:40:10 -0700
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <ca471dc2050512182557c89ebd@mail.gmail.com>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>	
	<ca471dc2050512160977cc60f@mail.gmail.com>	
	<4283E9F6.1020500@ocf.berkeley.edu>	
	<Pine.LNX.4.58.0505122000530.14555@server1.LFW.org>
	<ca471dc2050512182557c89ebd@mail.gmail.com>
Message-ID: <428510AA.8010708@ocf.berkeley.edu>

Guido van Rossum wrote:
> [Ka-Ping Yee]
> 
>>Maybe bare 'except' should be spelled 'except *'.
> 
> 
> -1.
> 
> 
>>I don't think it can be removed altogether because sometimes you just
>>need to be able to do magic, but it can be made a little more explicit.
> 
> 
> Assuming a single root of the exception tree, you can spell it
> explicitly as "except Exception" or perhaps (if that's not the root)
> "except Raisable" (cf. Java's Throwable).
> 

There are two possibilities for this.  Either we make all 'except' branches
explicit, or we keep bare 'except' branches but have them catch the
non-critical base exception rather than the root exception.

The thinking is that BaseException would be the root exception, with Exception
inheriting from it for non-showstopping exceptions and CriticalException
reserved for exceptions that you really should not catch unless you mean it
(KeyboardInterrupt, SystemError, MemoryError, etc.).  That would make a bare
'except' much more reasonable.

But personally I still prefer requiring explicit 'except' branches and simply
making the non-critical base exception the easier one to type, so that people
use it by default.
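A toy sketch of that hierarchy (the class names shadow the builtins here
purely for illustration; none of this exists in Python 2.4):

```python
class BaseException(Exception):          # proposed root (shadows the builtin)
    pass

class Exception(BaseException):          # non-showstopping exceptions
    pass

class CriticalException(BaseException):  # KeyboardInterrupt, MemoryError, ...
    pass

def handle(exc):
    # 'except Exception' is what a reformed bare 'except' would amount to:
    # it catches ordinary errors but not critical ones.
    try:
        raise exc
    except Exception:            # the easy-to-type, safe default
        return "caught"
    except CriticalException:    # must be named explicitly to be caught
        return "critical"
```

Since CriticalException is a sibling of (this) Exception rather than a
subclass, the first handler never swallows a showstopper by accident.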

-Brett

From shane.holloway at ieee.org  Fri May 13 22:51:58 2005
From: shane.holloway at ieee.org (Shane Holloway (IEEE))
Date: Fri, 13 May 2005 14:51:58 -0600
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505130305233b330e@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>	<4281E14C.1070405@gmail.com>	<79990c6b05051105361e7a9ba0@mail.gmail.com>	<42827154.4040006@gmail.com>	<d11dcfba05051114441404ef0a@mail.gmail.com>	<ca471dc20505112028b263df6@mail.gmail.com>
	<4282D512.7060602@zope.com>	<4edc17eb05051208341b6bf751@mail.gmail.com>	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
Message-ID: <4285136E.5010004@ieee.org>

+1 PEP 340 redux (although I marginally prefer the "with" keyword)

Guido van Rossum wrote:
> So then the all-important question I want to pose is: do we like the
> idea of using a (degenerate, decorated) generator as a "template" for
> the do-statement enough to accept the slightly increased complexity?
> The added complexity is caused by the need to separate VAR from EXPR
> so that a generator can be used. I personally like this separation; I
> actually like that the "anonymous block controller" is logically
> separate from the variable bound by the construct. From Greg Ewing's
> response to the proposal to endow file objects with __enter__ and
> __exit__ methods, I believe he thinks so too.

+1 to support template generators.  I think it allows for more 
flexibility on the class implementations as well, even if most of them 
return self.


> - Today I like the 'do' keyword better; 'with' might confuse folks
> coming from Pascal or VB

As a former pascal/delphi guy, I wasn't really confused.  ;)

+1 with
+0 do

> - I have a more elaborate proposal for __exit__'s arguments. Let the
> translation be as follows:

+1 -- as long as I can get information about how the block exited, I'm 
happy.

> If we adopt PEP 340 redux, it's up to the decorator for degenerate
> generators to decide how to pass this information into the generator;
> if we adopt PEP 342 ("continue EXPR") at the same time, we can let the
> yield-expression return a 4-tuple (oke, t, v, tb). Most templates can
> ignore this information (so they can just use a yield-statement).

+1 makes sense to me...

From bac at OCF.Berkeley.EDU  Fri May 13 22:52:24 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Fri, 13 May 2005 13:52:24 -0700
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <ca471dc205051218316ca4c611@mail.gmail.com>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>	
	<ca471dc205051216121093961b@mail.gmail.com>	
	<4283EB2C.7040602@ocf.berkeley.edu>	
	<ca471dc2050512174318ba97a8@mail.gmail.com>	
	<Pine.LNX.4.58.0505121944440.14555@server1.LFW.org>
	<ca471dc205051218316ca4c611@mail.gmail.com>
Message-ID: <42851388.7030407@ocf.berkeley.edu>

Guido van Rossum wrote:
> [Guido]
> 
>>>What if that method catches that exception?
> 
> 
> [Ka-Ping Yee]
> 
>>Did you mean something like this?
>>
>>    def handle():
>>        try:
>>            open('spamspamspam')
>>        except:
>>            catchit()
>>            # point A
>>            ...
>>
>>    def catchit():
>>        try:
>>            1/0
>>        except:
>>            pass
>>
>>Then there's no exception to propagate, so it doesn't matter.
>>Once we get to point A, the division by zero is long forgotten.
> 
> 
> But at what point does the attaching happen? If I catch the
> ZeroDivisionException inside catchit() and inspects its context
> attribute, does it reference the IOError instance raised by
> open('spamspamspam')?

Yes, at least in the way I am imagining this being implemented.  I was thinking
that when an exception happens, the global exception variable is checked to see
if it has a value.  If it does that gets assigned to the new exception's
'context' attribute and the new exception gets assigned to the global exception
variable.

> This could potentially cause a lot of extra
> work: when an inner loop that raises and catches lots of exceptions is
> invoked in the context of having caught an exception at some outer
> level, the inner loop keeps attaching the outer exception to each
> exception raised.
> 

[this also contains a partial answer to Phillip's email in this thread]

Maybe, but as long as caught exceptions get cleared that shouldn't be an issue.
Would this be solved if, when an 'except' branch is exited, exceptions are
cleared?  So, in the above example, once the 'pass' is hit in catchit() no
exception is considered active any longer.  This could be done with a CLEAR_EXC
opcode very easily inserted at the end of an 'except' branch by the compiler.

This would require explicit re-raising of exceptions to keep them alive after
an 'except' ends, but I think that is actually a good idea, and since this might
all wait until Python 3000 anyway, we don't need to worry about the semantic change.
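A rough pure-Python emulation of that bookkeeping (raise_chained and
clear_exc are made-up helpers standing in for the interpreter machinery and
the proposed CLEAR_EXC opcode):

```python
_active = None   # stands in for the global exception variable

def raise_chained(exc):
    """Attach the currently active exception (if any) as exc.context,
    record exc as the new active exception, then raise it."""
    global _active
    exc.context = _active
    _active = exc
    raise exc

def clear_exc():
    """What the proposed CLEAR_EXC opcode would do when an 'except'
    branch is exited: forget the active exception."""
    global _active
    _active = None
```

With explicit clearing, an inner raise-and-catch loop stops accumulating
context the moment its 'except' branch ends.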

-Brett

From mal at egenix.com  Fri May 13 23:21:18 2005
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 13 May 2005 23:21:18 +0200
Subject: [Python-Dev] Python's Unicode width default (New Py_UNICODE doc)
In-Reply-To: <42850BC6.4010907@v.loewis.de>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>	<427C029D.3090907@v.loewis.de>	<427D0E80.4080502@egenix.com>	<427DD4EF.4030109@v.loewis.de>	<427F9007.3070603@egenix.com>	<427FD6D8.2010003@v.loewis.de>
	<428079B5.6010602@egenix.com>	<42810104.3090303@v.loewis.de>
	<42848AA1.1020402@egenix.com> <42850BC6.4010907@v.loewis.de>
Message-ID: <42851A4E.9020609@egenix.com>

Martin v. L?wis wrote:
> M.-A. Lemburg wrote:
> 
>>I'm not breaking anything, I'm just correcting the
>>way things have to be configured in an effort to
>>bring back the cross-platform configure default.
> 
> Your proposed change will break the build of Python
> on Redhat/Fedora systems.

You know that this is not true. Python will happily
continue to compile on these systems.

>>I'm talking about the *configure* default, not the
>>default installation you find on any particular
>>platform (this remains a platform decision to be made
>>by the packagers).
> 
> 
> Why is it good to have such a default? Why is that
> so good that its better than having Tkinter work
> by default?

It is important to be able to rely on a default that
is used when no special options are given. The decision
to use UCS2 or UCS4 is much too important to be
left to a configure script.

>>The main point is that we can no longer tell users:
>>if you run configure without any further options,
>>you will get a UCS2 build of Python.
> 
> 
> It's not a matter of telling the users "no longer".
> "We" currently don't say that in any documentation;
> if you have been telling users that, you were wrong.
>
> ./configure --help says that the default for
> --enable-unicode is "yes".

Let's see:
http://www.python.org/peps/pep-0100.html
http://www.python.org/peps/pep-0261.html
http://www.python.org/doc/2.2.3/whatsnew/node8.html

Apart from the mention in the What's New document for
Python 2.2 and a FAQ entry, the documentation doesn't
mention UCS4 at all.

However, you're right: the configure script should print
"(default is ucs2)".

>>I want to restore this fact which was true before
>>Jeff's patch was applied.
> 
> 
> I understand that you want that. I'm opposed.

Noted.

>>Telling users to look at the configure script printout
>>to determine whether they have just built a UCS2
>>or UCS4 is just not right given its implications.
> 
> Right. We should tell them what the procedure is that
> is used.

No, we should make it an explicit decision by the
user running the configure script.

BTW, a UCS4 TCL is just as non-standard as a UCS4
Python build. Non-standard build options should never be
selected by a configure script on its own.

>>It will continue to work - the only change, if any,
>>is to add --enable-unicode=tcl or --enable-unicode=ucs4
>>(if you know that TCL uses UCS4) to your configure
>>setup. The --enable-unicode=ucs4 configure setting
>>is part of RedHat and SuSE already, so there won't
>>be any changes necessary.
> 
> Yes, but users of these systems need to adjust.

Not really: they won't even notice the change in the
configure script if they use the system-provided Python
versions. Or am I missing something?


Regardless of all this discussion, I think we should
try to get _tkinter.c to work with a UCS4 TCL version
as well. The conversion from UCS4 (Python) to UCS2 (TCL)
is already integrated, so adding support for the other way
around should be rather straightforward.

Any takers ?

Regards,
-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 13 2005)
 >>> Python/Zope Consulting and Support ...        http://www.egenix.com/
 >>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
 >>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! ::::

From gvanrossum at gmail.com  Sat May 14 00:28:22 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 15:28:22 -0700
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <42851388.7030407@ocf.berkeley.edu>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>
	<ca471dc205051216121093961b@mail.gmail.com>
	<4283EB2C.7040602@ocf.berkeley.edu>
	<ca471dc2050512174318ba97a8@mail.gmail.com>
	<Pine.LNX.4.58.0505121944440.14555@server1.LFW.org>
	<ca471dc205051218316ca4c611@mail.gmail.com>
	<42851388.7030407@ocf.berkeley.edu>
Message-ID: <ca471dc205051315287d0c2d27@mail.gmail.com>

[Brett C.]
> Maybe, but as long as caught exceptions get cleared that shouldn't be an issue.
> Would this be solved if, when an 'except' branch is exited, exceptions are
> cleared?  So, in the above example, once the 'pass' is hit in catchit() no
> exception is considered active any longer.  This could be done with a CLEAR_EXC
> opcode very easily inserted at the end of an 'except' branch by the compiler.

Sure, but that would be backwards incompatible. There's plenty of code
that expects sys.exc_info() to continue to return the caught exception
*outside* the except block. This is all incredibly tricky, to some
extent for backwards compatibility reasons (please read the source
code for maintaining the exc_info data!).

In Python 3000, I think we can get rid of sys.exc_info() altogether
once we place the traceback in the exception object as the 'traceback'
attribute: if you want this info, all you need is write

    except SomeException, err:
        # now type is err.__class__, value is err,
        # and traceback is err.traceback

If you want to have this with an "except:" clause, you can just catch
'Exception' or perhaps 'BaseException'. This isn't possible in Python
2.x since there's no single base class.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Sat May 14 02:13:00 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 17:13:00 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
Message-ID: <ca471dc205051317133cf8fd63@mail.gmail.com>

I've written up the specs for my "PEP 340 redux" proposal as a
separate PEP, PEP 343.

http://python.org/peps/pep-0343.html

Those who have been following the thread "Merging PEP 310 and PEP
340-redux?" will recognize my proposal in that thread, which received
mostly positive responses there.

Please review and ask for clarifications of anything that's unclear.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From greg.ewing at canterbury.ac.nz  Sat May 14 02:10:03 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 14 May 2005 12:10:03 +1200
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
Message-ID: <428541DB.6090703@canterbury.ac.nz>

Guido van Rossum wrote:
> So then the all-important question I want to pose is: do we like the
> idea of using a (degenerate, decorated) generator as a "template" for
> the do-statement enough to accept the slightly increased complexity?

I can't see how this has anything to do with whether
a generator is used or not. Keeping them separate
seems to be a useful thing in its own right.

Greg


From gvanrossum at gmail.com  Sat May 14 02:28:14 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 17:28:14 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <428541DB.6090703@canterbury.ac.nz>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
	<428541DB.6090703@canterbury.ac.nz>
Message-ID: <ca471dc2050513172842337745@mail.gmail.com>

[Guido van Rossum]
> > So then the all-important question I want to pose is: do we like the
> > idea of using a (degenerate, decorated) generator as a "template" for
> > the do-statement enough to accept the slightly increased complexity?

[Greg Ewing]
> I can't see how this has anything to do with whether
> a generator is used or not. Keeping them separate
> seems to be a useful thing in its own right.

Assuming by "them" you mean the value of EXPR and the value assigned
to VAR, I don't care how this conclusion is reached, as long as their
separation is seen as a useful thing. :-)

I came up with the idea of making them separate when I tried to figure
out how to decorate a generator to drive a PEP-310-style
with-statement, and found I couldn't do it for the opening() example.
(Michael Hudson illustrated this nicely in his reply in this thread.
:-)

But it's fine if the separation is considered generally useful even
without thinking of generators.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From greg.ewing at canterbury.ac.nz  Sat May 14 02:26:16 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 14 May 2005 12:26:16 +1200
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
References: <428437F2.9020409@canterbury.ac.nz>
	<ca471dc2050509215823876c50@mail.gmail.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<5.1.1.6.0.20050513101523.03d908a8@mail.telecommunity.com>
Message-ID: <428545A8.30700@canterbury.ac.nz>

Here's my vote on things at the moment:

+1 on

   do EXPR as VAR:
     ...

+1 on keeping the EXPR and VAR distinct.

+1 on keeping the do and generator protocols distinct.

+1 on not going out of our way to let the controller
catch exceptions or alter control flow. Let's keep it
as simple as we can.

-0.7 on directly giving generators do-protocol methods.
I'm not yet convinced that encouraging people to use
generators to implement block controllers is a good
idea. If we blur the distinction too much at this
stage, we may regret it later if we come up with a
better idea. Also I don't see that people will be
writing block controllers anywhere near as often as
iterators, so writing classes for them isn't going to
be a big chore. And people can always use a
do-protocol-to-generator adaptor if they really want.

Greg



From ncoghlan at gmail.com  Sat May 14 02:55:39 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 14 May 2005 10:55:39 +1000
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc205051310084ac597be@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>	<79990c6b05051105361e7a9ba0@mail.gmail.com>	<42827154.4040006@gmail.com>	<d11dcfba05051114441404ef0a@mail.gmail.com>	<ca471dc20505112028b263df6@mail.gmail.com>
	<4282D512.7060602@zope.com>	<4edc17eb05051208341b6bf751@mail.gmail.com>	<428437F2.9020409@canterbury.ac.nz>	<ca471dc20505130305233b330e@mail.gmail.com>	<d62l8n$ru5$1@sea.gmane.org>
	<ca471dc205051310084ac597be@mail.gmail.com>
Message-ID: <42854C8B.9040302@gmail.com>

Guido van Rossum wrote:
> BTW, we need a name for such a decorated generator that only yields
> once. I propose to call it a degenerator.

Cute, but 'template generator' may be clearer (that's what I've been calling 
them so far, anyway). Then 'iterator generator' can be used to explicitly refer 
to generators intended for use in for loops.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From gvanrossum at gmail.com  Sat May 14 02:58:47 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 17:58:47 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <428545A8.30700@canterbury.ac.nz>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<5.1.1.6.0.20050513101523.03d908a8@mail.telecommunity.com>
	<428545A8.30700@canterbury.ac.nz>
Message-ID: <ca471dc2050513175860e6bc4c@mail.gmail.com>

[Greg Ewing]
> -0.7 on directly giving generators do-protocol methods.

I'm -1 on this myself.

> I'm not yet convinced that encouraging people to use
> generators to implement block controllers is a good
> idea. If we blur the distinction too much at this
> stage, we may regret it later if we come up with a
> better idea. Also I don't see that people will be
> writing block controllers anywhere near as often as
> iterators, so writing classes for them isn't going to
> be a big chore. And people can always use a
> do-protocol-to-generator adaptor if they really want.

Right. I'm +0 on adding a standard module defining a do_template
decorator that turns a degenerate generator into a do-statement
controller.
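Such a decorator would only be a few lines; here is one possible sketch
(the names are illustrative, not part of any PEP):

```python
class _DoTemplate(object):
    """Wrap a generator that yields exactly once so it speaks the
    __enter__/__exit__ protocol of the do-statement."""

    def __init__(self, gen):
        self.gen = gen

    def __enter__(self):
        # Run up to the yield; the yielded value becomes VAR.
        return next(self.gen)   # gen.next() in 2005-era Python

    def __exit__(self, *exc_info):
        # Resume after the yield; the generator must then finish.
        try:
            next(self.gen)
        except StopIteration:
            pass
        else:
            raise RuntimeError("generator yielded more than once")

def do_template(func):
    def helper(*args, **kwds):
        return _DoTemplate(func(*args, **kwds))
    return helper
```

Code before the single yield plays the role of __enter__, code after it the
role of __exit__.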

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Sat May 14 03:10:07 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Fri, 13 May 2005 21:10:07 -0400
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051317133cf8fd63@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>

At 05:13 PM 5/13/2005 -0700, Guido van Rossum wrote:
>I've written up the specs for my "PEP 340 redux" proposal as a
>separate PEP, PEP 343.
>
>http://python.org/peps/pep-0343.html
>
>Those who have been following the thread "Merging PEP 310 and PEP
>340-redux?" will recognize my proposal in that thread, which received
>mostly positive responses there.
>
>Please review and ask for clarifications of anything that's unclear.

May I suggest this alternative translation in the "Specification" section:

         abc = EXPR
         __args = ()  # pseudo-variable, not visible to the user

         try:
             VAR = abc.__enter__()
             try:
                 BLOCK
             except:
                 __args = sys.exc_info()
         finally:
             abc.__exit__(*__args)

While slightly more complex than the current translation, the current
translation seems a bit misleading to me.  OTOH, that may simply be because 
I see the *sys.exc_info() part and immediately wonder what happens when 
there was no exception, and sys.exc_info() contains some arbitrary previous 
data...

Also, one question: will the "do protocol" be added to built-in "resource" 
types?  That is, locks, files, sockets, and so on?  Or will there instead 
be "macros" like the "opening" and "locking" templates?  I notice that 
grammatically, "do gerund" works a lot better than "do noun", so all of 
your examples are words like locking, blocking, opening, redirecting, and 
so on.  This makes it seem awkward for e.g. "do self.__lock", which doesn't 
make any sense.  But the extra call needed to make it "do 
locking(self.__lock)" seems sort of gratuitous.

It makes me wonder if "with" or "using" or some similar word that works 
better with nouns might be more appropriate, as then it would let us just 
add the resource protocol to common objects, and still read well.  For 
example, a Decimal Context object might implement __enter__ by setting 
itself as the thread-local context, and __exit__ by restoring the previous 
context.    "do aDecimalContext" doesn't make much sense, but "with 
aDecimalContext" or "using aDecimalContext" reads quite nicely.
         


From ncoghlan at gmail.com  Sat May 14 03:15:32 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 14 May 2005 11:15:32 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051317133cf8fd63@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
Message-ID: <42855134.4060803@gmail.com>

Guido van Rossum wrote:
> I've written up the specs for my "PEP 340 redux" proposal as a
> separate PEP, PEP 343.
> 
> http://python.org/peps/pep-0343.html
> 
> Those who have been following the thread "Merging PEP 310 and PEP
> 340-redux?" will recognize my proposal in that thread, which received
> mostly positive responses there.
> 
> Please review and ask for clarifications of anything that's unclear.

+1 here.

The stdout redirection example needs to be corrected to avoid yielding inside a 
try/finally though:

     5. Redirect stdout temporarily:

         @do_template
         def redirecting_stdout(new_stdout):
             save_stdout = sys.stdout
             try:
                 sys.stdout = new_stdout
             except:
                 sys.stdout = save_stdout
                 raise
             else:
                 yield None
                 sys.stdout = save_stdout

        Used as follows:

         do opening(filename, "w") as f:
             do redirecting_stdout(f):
                 print "Hello world"

This could be left as the more elegant original if iterator finalisation (e.g.
using a "__finish__()" slot) came in at the same time as user-defined
statements, allowing the above to be written naturally with try/finally.

Arnold deVos's HTML tagging example would need access to the exception 
information and could be rewritten as a class:

   class tag(object):
       def __init__(self, name):
           self.name = cgi.escape(name)

       def __enter__(self):
           print '<%s>' % self.name
           return self.name

       def __exit__(self, *exc_info):
           if not exc_info or exc_info[0] is None:
               print '</%s>' % self.name

Used as follows::

   do tag('html'):
       do tag('head'):
          do tag('title'):
             print 'A web page'
       do tag('body'):
          for par in pars:
             do tag('p'):
                print par

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From python-dev at zesty.ca  Sat May 14 03:29:02 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Fri, 13 May 2005 20:29:02 -0500 (CDT)
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc20505130305233b330e@mail.gmail.com>
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<5.1.1.6.0.20050510123311.02471508@mail.telecommunity.com>
	<4281E14C.1070405@gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com>
	<4282D512.7060602@zope.com> <4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505131957590.14555@server1.LFW.org>

On Fri, 13 May 2005, Guido van Rossum wrote:
> Straight up-or-down votes in the full senate are appreciated at this point.

I prefer the "PEP 340 redux" version.  Both the flexibility for __enter__
to return a separate object and the ability for __exit__ to react to
exceptions are useful.

> - Today I like the 'do' keyword better; 'with' might confuse folks
> coming from Pascal or VB

I prefer 'with'.  The 'do' keyword sounds "loopy" and doesn't make
grammatical sense.


-- ?!ng

From janssen at parc.com  Sat May 14 04:19:25 2005
From: janssen at parc.com (Bill Janssen)
Date: Fri, 13 May 2005 19:19:25 PDT
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: Your message of "Fri, 13 May 2005 18:29:02 PDT."
	<Pine.LNX.4.58.0505131957590.14555@server1.LFW.org> 
Message-ID: <05May13.191926pdt."58617"@synergy1.parc.xerox.com>

> I prefer 'with'.  The 'do' keyword sounds "loopy" and doesn't make
> grammatical sense.

I agree.  "with [VAR =] BLOCK:" just reads better.  "with BLOCK [as
VAR]:" is OK, too.  Or even "within", as in

  within BLOCK binding VAR:
      SOMETHING

Probably too techy.

Bill

From gvanrossum at gmail.com  Sat May 14 05:58:12 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 20:58:12 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
Message-ID: <ca471dc2050513205855dcba6e@mail.gmail.com>

[Phillip J. Eby]
> May I suggest this alternative translation in the "Specification" section:
> 
>          abc = EXPR
>          __args = ()  # pseudo-variable, not visible to the user
> 
>          try:
>              VAR = abc.__enter__()
>              try:
>                  BLOCK
>              except:
>                  __args = sys.exc_info()
>          finally:
>              abc.__exit__(*__args)

Done (except you forgot to add a "raise" to the except clause).

> While slightly more complex than the current translation, the current
> translation seems a bit misleading to me.  OTOH, that may simply be because
> I see the *sys.exc_info() part and immediately wonder what happens when
> there was no exception, and sys.exc_info() contains some arbitrary previous
> data...

Right. Well, anyway, the actual implementation will just get the
exception info from the try/finally infrastructure -- it's squirreled
away somewhere on the stack even if sys.exc_info() (intentionally)
doesn't have access to it.

> Also, one question: will the "do protocol" be added to built-in "resource"
> types?  That is, locks, files, sockets, and so on?

One person proposed that and it was shot down by Greg Ewing. I think
it's better to require a separate wrapper.

> Or will there instead
> be "macros" like the "opening" and "locking" templates?  I notice that
> grammatically, "do gerund" works a lot better than "do noun", so all of
> your examples are words like locking, blocking, opening, redirecting, and
> so on.  This makes it seem awkward for e.g. "do self.__lock", which doesn't
> make any sense.  But the extra call needed to make it "do
> locking(self.__lock)" seems sort of gratuitous.

Maybe. There seems to be a surge of proponents for 'do' at the moment.

> It makes me wonder if "with" or "using" or some similar word that works
> better with nouns might be more appropriate, as then it would let us just
> add the resource protocol to common objects, and still read well.  For
> example, a Decimal Context object might implement __enter__ by setting
> itself as the thread-local context, and __exit__ by restoring the previous
> context.    "do aDecimalContext" doesn't make much sense, but "with
> aDecimalContext" or "using aDecimalContext" reads quite nicely.

Maybe. I think we ought to implement the basic mechanism first and
then decide how fancy we want to get, so I'd rather not get into this
in the PEP itself; I'll just note the suggestion there.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Sat May 14 05:59:56 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 20:59:56 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <42855134.4060803@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<42855134.4060803@gmail.com>
Message-ID: <ca471dc205051320592f5e9806@mail.gmail.com>

[Nick Coghlan]
> The stdout redirection example needs to be corrected to avoid yielding inside a
> try/finally though:

Thanks -- fixed now.

> This could be left as the more elegant original if iterator finalisation (e.g.
> using a "__finish__()" slot) came in at the same time as user defined
> statements, allowing the above to be written naturally with try/finally.

Let's not try to tie this to other features. I tried that with PEP 340
and you know the mess it became. :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Sat May 14 07:32:27 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 13 May 2005 22:32:27 -0700
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <-7082760554711160022@unknownmsgid>
References: <Pine.LNX.4.58.0505131957590.14555@server1.LFW.org>
	<-7082760554711160022@unknownmsgid>
Message-ID: <ca471dc205051322324a786274@mail.gmail.com>

[Bill Janssen]
> I agree.  "with [VAR =] BLOCK:" just reads better.  "with BLOCK [as
> VAR]:" is OK, too.

Maybe someone can set up a public poll (isn't that something you can
do on Yahoo? Or some volunteer can probably write it in Zope in 3
minutes) asking whether people prefer 'do' or 'with'. I'll go with the
outcome; this is pretty much the only contentious point in PEP 343 at
this point.

(That's http://www.python.org/peps/pep-0343.html in case you missed my
announcement; somehow Gmail classified it as spam for me even though I
sent it myself! :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From fumanchu at amor.org  Sat May 14 08:00:05 2005
From: fumanchu at amor.org (Robert Brewer)
Date: Fri, 13 May 2005 23:00:05 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
Message-ID: <3A81C87DC164034AA4E2DDFE11D258E37720A9@exchange.hqamor.amorhq.net>

Guido van Rossum wrote:
> I've written up the specs for my "PEP 340 redux" proposal as a
> separate PEP, PEP 343.
> 
> http://python.org/peps/pep-0343.html
> 
> Those who have been following the thread "Merging PEP 310 and PEP
> 340-redux?" will recognize my proposal in that thread, which received
> mostly positive responses there.
> 
> Please review and ask for clarifications of anything that's unclear.

There's a typo in the code snippets at the moment.

        The translation of the above statement is:

        abc = EXPR
        exc = ()  # Or (None, None, None) ?
        try:
            try:
                VAR = abc.__enter__()
                BLOCK
            except:
                exc = sys.exc_info()
                raise
        finally:
            abc.__exit__(exc)

I think you meant "abc.__exit__(*exc)". Assuming that, then "exc =
(None, None, None)" makes the most sense. If exc_info() is going to be
passed as a single arg, then I'd rather have the default "exc = ()", so
I can simply check "if exc:" in the __exit__ method.
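The corrected translation can be emulated as runnable code (modern Python; `run_with` and `Recorder` are illustrative names, not part of the PEP):

```python
import sys

def run_with(mgr, body):
    # Emulation of the (corrected) PEP 343 translation: __exit__ always
    # runs, receiving (None, None, None) on success or the unpacked
    # sys.exc_info() triple when the body raises.
    exc = (None, None, None)
    try:
        try:
            var = mgr.__enter__()
            body(var)
        except:
            exc = sys.exc_info()
            raise
    finally:
        mgr.__exit__(*exc)

class Recorder:
    # Minimal resource that records how its methods are called.
    def __init__(self):
        self.calls = []
    def __enter__(self):
        self.calls.append("enter")
        return self
    def __exit__(self, exc_type, exc_val, exc_tb):
        self.calls.append(("exit", exc_type))
```

With no exception, `__exit__` sees three `None`s; with an exception, it sees the type, value, and traceback, and the exception still propagates.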


Robert Brewer
System Architect
Amor Ministries
fumanchu at amor.org

From ncoghlan at gmail.com  Sat May 14 10:33:25 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 14 May 2005 18:33:25 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <3A81C87DC164034AA4E2DDFE11D258E37720A9@exchange.hqamor.amorhq.net>
References: <3A81C87DC164034AA4E2DDFE11D258E37720A9@exchange.hqamor.amorhq.net>
Message-ID: <4285B7D5.1090704@gmail.com>

Robert Brewer wrote:
> There's a typo in the code snippets at the moment.
> 
>         The translation of the above statement is:
> 
>         abc = EXPR
>         exc = ()  # Or (None, None, None) ?
>         try:
>             try:
>                 VAR = abc.__enter__()
>                 BLOCK
>             except:
>                 exc = sys.exc_info()
>                 raise
>         finally:
>             abc.__exit__(exc)
> 
> I think you meant "abc.__exit__(*exc)". Assuming that, then "exc =
> (None, None, None)" makes the most sense. If exc_info() is going to be
> passed as a single arg, then I'd rather have the default "exc = ()", so
> I can simply check "if exc:" in the __exit__ method.

Also, the call to __enter__() needs to be before the try/finally block (as it is 
in PEP 310). Otherwise we get the "releasing a lock you failed to acquire" problem.
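The failure mode is easy to demonstrate with a sketch (modern Python; `FailingLock` is a hypothetical resource whose acquisition always fails): if `__enter__()` sits inside the try/finally, `__exit__()` runs even though acquisition never succeeded.

```python
class FailingLock:
    # Hypothetical lock whose acquire always fails.
    def __init__(self):
        self.acquired = False
        self.released_without_acquiring = False
    def __enter__(self):
        raise RuntimeError("could not acquire")
    def __exit__(self, *args):
        if not self.acquired:
            self.released_without_acquiring = True

lock = FailingLock()
try:
    # PEP 343 rev 1.7 places __enter__ *inside* the try/finally:
    try:
        lock.__enter__()  # raises
    finally:
        lock.__exit__(None, None, None)  # runs anyway: the bug described above
except RuntimeError:
    pass
```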

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Sat May 14 11:37:36 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 14 May 2005 19:37:36 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051317133cf8fd63@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
Message-ID: <4285C6E0.90307@gmail.com>

Guido van Rossum wrote:
> I've written up the specs for my "PEP 340 redux" proposal as a
> separate PEP, PEP 343.
> 
> http://python.org/peps/pep-0343.html
> 
> Those who have been following the thread "Merging PEP 310 and PEP
> 340-redux?" will recognize my proposal in that thread, which received
> mostly positive responses there.
> 
> Please review and ask for clarifications of anything that's unclear.

On the keyword front, the two keyword choices affect the naming conventions of 
templates differently, and I think need to be considered in that light.

The naming convention for 'do' is shown in the current PEP 343. The issue I've 
noticed with it is that *functions* read well, but methods don't because things 
get out of sequence. That is, "do locking(the_lock)" reads well, but "do 
the_lock.locking()" does not.

Whereas, using 'with', it can be written either way, and still read reasonably 
well ("with locked(the_lock)", "with the_lock.locked()").

The 'with' keyword also reads better if objects natively support use in 'with' 
blocks ("with the_lock", "with the_file").

Guido's concern regarding file objects being reused inappropriately can be dealt 
with in the file __enter__ method:

   def __enter__(self):
       if self.closed:
           raise RuntimeError, "Cannot reopen closed file handle"
       return self

Template generators have the exact same problem with reusability - the solution 
used there is raising a RuntimeError when __enter__() is called inappropriately. 
This would make sense as a standard idiom - if a statement template can't be 
reused, attempting to do so should trigger a RuntimeError the second time 
__enter__() is invoked.
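A minimal sketch of that idiom (modern Python; `OneShot` is an illustrative name): the second call to `__enter__()` raises RuntimeError.

```python
class OneShot:
    # Hypothetical non-reusable statement template following the idiom
    # described above: a second __enter__() raises RuntimeError.
    def __init__(self):
        self._used = False
    def __enter__(self):
        if self._used:
            raise RuntimeError("statement template cannot be reused")
        self._used = True
        return self
    def __exit__(self, *args):
        pass
```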

For files, it may then become the common practice to keep pathnames around, 
rather than open file handles. When you actually needed access to the file, the 
existing "open" builtin would suffice:

   with open(filename, "rb") as f:
       for line in f:
           print line

I've written out the PEP 343 examples below, assuming types acquire native with 
statement support (including Decimal contexts - I also give PEP 343 style code 
for Decimal contexts).

PEP343 examples: 'with' keyword, native support in objects

    1. A template for ensuring that a lock, acquired at the start of a
        block, is released when the block is left:

         # New methods on lock objects
             def __enter__(self):
                 self.acquire()

             def __exit__(self, *args):
                 self.release()

        Used as follows:

         with myLock:
             # Code here executes with myLock held.  The lock is
             # guaranteed to be released when the block is left (even
             # if via return or by an uncaught exception).

     2. A template for opening a file that ensures the file is closed
        when the block is left:

         # New methods on file objects
             def __enter__(self):
                 if self.closed:
                     raise RuntimeError, "Cannot reopen closed file handle"
                 return self

             def __exit__(self, *args):
                 self.close()

        Used as follows:

         with open("/etc/passwd") as f:
             for line in f:
                 print line.rstrip()

     3. A template for committing or rolling back a database
        transaction; this is written as a class rather than as a
        decorator since it requires access to the exception information:

         class transaction:
             def __init__(self, db):
                 self.db = db
             def __enter__(self):
                 self.db.begin()
             def __exit__(self, *args):
                 if args and args[0] is not None:
                     self.db.rollback()
                 else:
                     self.db.commit()

       Used as follows:

         with transaction(db):
             # Exceptions in this code cause a rollback

     5. Redirect stdout temporarily:

         @with_template
         def redirected_stdout(new_stdout):
             save_stdout = sys.stdout
             sys.stdout = new_stdout
             yield None
             sys.stdout = save_stdout

        Used as follows:

         with open(filename, "w") as f:
             with redirected_stdout(f):
                 print "Hello world"

        This isn't thread-safe, of course, but neither is doing this
        same dance manually.  In a single-threaded program (e.g., a
        script) it is a totally fine way of doing things.

     6. A variant on opening() that also returns an error condition:

         @with_template
         def open_w_error(filename, mode="r"):
             try:
                 f = open(filename, mode)
             except IOError, err:
                 yield None, err
             else:
                 yield f, None
                 f.close()

        Used as follows:

         with open_w_error("/etc/passwd", "a") as f, err:
             if err:
                 print "IOError:", err
             else:
                 f.write("guido::0:0::/:/bin/sh\n")

     7. Another useful example would be an operation that blocks
        signals.  The use could be like this:

         from signal import blocked_signals

         with blocked_signals():
             # code executed without worrying about signals

        An optional argument might be a list of signals to be blocked;
        by default all signals are blocked.  The implementation is left
        as an exercise to the reader.

     8. Another use for this feature is the Decimal context.

         # New methods on decimal Context objects
             def __enter__(self):
                 self._old_context = getcontext()
                 setcontext(self)

             def __exit__(self, *args):
                 setcontext(self._old_context)

        Used as follows:

         with decimal.Context(precision=28):
             # Code here executes with the given context
             # The context always reverts after this statement

For comparison, the equivalent PEP 343 code is:

     @do_template
     def with_context(context):
         old_context = getcontext()
         setcontext(context)
         yield None
         setcontext(old_context)

    Used as:

         do decimal.with_context(decimal.Context(precision=28)):
             # Code here executes with the given context
             # The context always reverts after this statement
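For reference, a minimal sketch of what such a `with_template`/`do_template` decorator might look like (the decorator itself is not defined in this thread, so the details here are assumptions; modern Python). It runs the generator to its `yield` in `__enter__` and resumes it in `__exit__`:

```python
class _GeneratorTemplate:
    # Wraps a generator so it can drive an __enter__/__exit__ pair.
    def __init__(self, gen):
        self.gen = gen
    def __enter__(self):
        return next(self.gen)        # run setup code up to the yield
    def __exit__(self, *exc):
        try:
            next(self.gen)           # resume: run teardown after the yield
        except StopIteration:
            pass

def with_template(func):
    def helper(*args, **kwds):
        return _GeneratorTemplate(func(*args, **kwds))
    return helper

@with_template
def tracking(log):
    log.append("setup")
    yield None
    log.append("teardown")
```

Note that this naive version resumes the generator unconditionally and cannot pass the exception into the generator frame, which is exactly the limitation discussed elsewhere in the thread.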

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From fredrik at pythonware.com  Sat May 14 12:08:29 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Sat, 14 May 2005 12:08:29 +0200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
Message-ID: <d64imp$fai$1@sea.gmane.org>

Guido van Rossum wrote:

> I've written up the specs for my "PEP 340 redux" proposal as a
> separate PEP, PEP 343.
>
> http://python.org/peps/pep-0343.html
>
> Those who have been following the thread "Merging PEP 310 and PEP
> 340-redux?" will recognize my proposal in that thread, which received
> mostly positive responses there.
>
> Please review and ask for clarifications of anything that's unclear.

intuitively, I'm -1 on this proposal.

unlike the original design, all you get from this is
the ability to add try/finally blocks to your code
without ever writing a try/finally-clause (neither
in your code or in the block controller).  that
doesn't strike me as especially pythonic.

(neither does the argument feel right that
just because you can use a mechanism to
write inscrutable code, such a mechanism
must not be made available; Python's design
has always been about careful tradeoffs
between policy and mechanism, but this is
too much policy for my taste.  the original
PEP 340 might have been too clever, but
this reduced version feels pretty pointless).

</F>




From ncoghlan at gmail.com  Sat May 14 12:27:43 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 14 May 2005 20:27:43 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <d64imp$fai$1@sea.gmane.org>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
Message-ID: <4285D29F.7080100@gmail.com>

Fredrik Lundh wrote:
> unlike the original design, all you get from this is
> the ability to add try/finally blocks to your code
> without ever writing a try/finally-clause (neither
> in your code or in the block controller).  that
> doesn't strike me as especially pythonic.

I think the key benefit relates to the fact that correctly written resource 
management code currently has to be split into two pieces - the first piece
before the try block (e.g. 'lock.acquire()', 'f = open()'), and the latter in 
the finally clause (e.g. 'lock.release()', 'f.close()').

PEP 343 (like PEP 310 before it) makes it possible to define the correct 
resource management *once*, and then invoke it via a 'with' (or 'do') statement.

Instead of having to check for "is this file closed properly?", as soon as you 
write or see "with open(filename) as f:", you *know* that that file is going to 
be closed correctly.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From mwh at python.net  Sat May 14 13:30:34 2005
From: mwh at python.net (Michael Hudson)
Date: Sat, 14 May 2005 12:30:34 +0100
Subject: [Python-Dev] Merging PEP 310 and PEP 340-redux?
In-Reply-To: <ca471dc205051308417c541292@mail.gmail.com> (Guido van Rossum's
	message of "Fri, 13 May 2005 08:41:50 -0700")
References: <ca471dc2050509215823876c50@mail.gmail.com>
	<79990c6b05051105361e7a9ba0@mail.gmail.com>
	<42827154.4040006@gmail.com>
	<d11dcfba05051114441404ef0a@mail.gmail.com>
	<ca471dc20505112028b263df6@mail.gmail.com> <4282D512.7060602@zope.com>
	<4edc17eb05051208341b6bf751@mail.gmail.com>
	<428437F2.9020409@canterbury.ac.nz>
	<ca471dc20505130305233b330e@mail.gmail.com>
	<2mpsvv5kxp.fsf@starship.python.net>
	<ca471dc205051308417c541292@mail.gmail.com>
Message-ID: <2mekca59k5.fsf@starship.python.net>

Guido van Rossum <gvanrossum at gmail.com> writes:

> [Michael Hudson, after much thinking aloud]

Yeah, sorry about that :)

>> Oh, I guess the point is that with a decorated generator you can yield
>> a value to be used as VAR, rather than just discarding the value as
>> here.  Hmm.
>
> Right. (I thought it was worth quoting this for the benefit of other
> who went down the same trail but didn't quite make it to this
> destination.)
>
>> If things were fiddled such that sys.exc_info() return non-Nones when
>> a finally clause is being executed because of an exception, we don't
>> really need this wart, do we?
>
> The problem is that sys.exc_info() almost always returns *something*
> -- it's usually the last exception that was *ever* caught, except in
> certain circumstances.

Yeah, OK.  I'll stop claiming to understand sys.exc_info() apart from
the simple cases.

> [Michael again]
>> Compare and contrast:
>> 
>> @template
>> def redirected_stdout(out):
>>     save_stdout = sys.stdout
>>     sys.stdout = out
>> 
>>     yield None
>> 
>>     sys.stdout = save_stdout
>> 
>> class redirected_stdout(object):
>> 
>>     def __init__(self, output):
>>         self.output = output
>> 
>>     def __enter__(self):
>>         self.save_stdout = sys.stdout
>>         sys.stdout = self.output
>> 
>>     def __exit__(self):
>>         sys.stdout = self.save_stdout
>> 
>> The former is shorter and contains less (well, no) 'self.'s, but I
>> think I find the latter somewhat clearer.
>
> Tastes differ. I think the generator wins; more so when there's more
> state to remember.

Certainly when there's more state to manage, yes.  But both will be
possible, so, *shrug*.  It's not a big deal.

> [Michael quoting Guido]
>> > The added complexity is caused by the need to separate VAR from EXPR
>> > so that a generator can be used. I personally like this separation; I
>> > actually like that the "anonymous block controller" is logically
>> > separate from the variable bound by the construct.
>> 
>> Nevertheless, I think I actually like this argument!
>
> (Repeated for the benefit of others.)

I guess this means PEP 310 can be retracted.

Finally, from PEP 343 rev 1.7:

        exc = ()  # Or (None, None, None) ?

The latter, please.

Cheers,
mwh

-- 
    . <- the point                                your article -> .
    |------------------------- a long way ------------------------|
                                       -- Christophe Rhodes, ucam.chat

From fredrik at pythonware.com  Sat May 14 13:55:13 2005
From: fredrik at pythonware.com (Fredrik Lundh)
Date: Sat, 14 May 2005 13:55:13 +0200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
References: <ca471dc205051317133cf8fd63@mail.gmail.com><d64imp$fai$1@sea.gmane.org>
	<4285D29F.7080100@gmail.com>
Message-ID: <d64ous$v7a$1@sea.gmane.org>

Nick Coghlan wrote:

> I think the key benefit relates to the fact that correctly written resource
> management code currently has to be split into two pieces - the first piece
> before the try block (e.g. 'lock.acquire()', 'f = open()'), and the latter in
> the finally clause (e.g. 'lock.release()', 'f.close()').
>
> PEP 343 (like PEP 310 before it) makes it possible to define the correct
> resource management *once*, and then invoke it via a 'with' (or 'do')
> statement.

sure, but even if you look at both the application code *and*
the resource management, there are no clues that the "with"
statement is really just a masked "try/finally" statement.  just
look at the generator example:

    acquire
    yield
    release

what in this snippet tells you that the "release" part will run even if
the external block raises an exception?  you could at least change
that to

    acquire
    try:
        yield
    finally:
        release

which would make it a lot more obvious what's going on here.
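With modern generators (the `close()` method proposed in PEP 342), the difference is observable: a generator abandoned at its `yield` skips its cleanup unless the `yield` is wrapped in try/finally. A sketch:

```python
def no_finally(log):
    log.append("acquire")
    yield
    log.append("release")        # skipped if the generator is abandoned

def with_finally(log):
    log.append("acquire")
    try:
        yield
    finally:
        log.append("release")    # runs even on close() or an exception
```

Closing each generator while it is suspended at the `yield` shows the release step running only in the try/finally version.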

also, come to think of it, adding a new statement just to hide
try/finally statements is a waste of statement space.  why not
just enhance the existing try statement?  let

    try with opening(file) as f:
        body
    except IOError:
        deal with the error (you have to do this anyway)

behave like

    try:
        f = opening(file)
        try:
            try:
                body
            except:
                exc = sys.exc_info()
            else:
                exc = None
        finally:
            f.__cleanup__(exc)
    except IOError:
        deal with the error

and you're done.  (or you could use __enter__ and __exit__ instead,
which would give you a variation of PEP-343-as-I-understand-it)

compared to a separate statement, the worst that can happen here,
usage-wise, is that you'll end up adding an "except: raise" line here
and there to propagate real exceptions rather than dealing with them
in place.  on the other hand, for the cases where you actually want
to deal with the exceptions, you'll save a line of code.  I think that's
a net win.

but I still think that something closer to the original PEP 340 is a lot
more useful.

</F>




From p.f.moore at gmail.com  Sat May 14 15:08:30 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Sat, 14 May 2005 14:08:30 +0100
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <d64ous$v7a$1@sea.gmane.org>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org> <4285D29F.7080100@gmail.com>
	<d64ous$v7a$1@sea.gmane.org>
Message-ID: <79990c6b0505140608277706a7@mail.gmail.com>

On 5/14/05, Fredrik Lundh <fredrik at pythonware.com> wrote:
> Nick Coghlan wrote:
> 
> > PEP 343 (like PEP 310 before it) makes it possible to define the correct
> > resource management *once*, and then invoke it via a 'with' (or 'do')
> > statement.

This is probably the main point for me - encapsulate the try...finally
dance in such a way that the two parts are not separated by an
(arbitrarily long) chunk of code. I hated the equivalent dances in C
(usually malloc/free stuff in that case) and it feels awkward that
this is the one real ugliness of C that Python hasn't fixed for me :-)

> sure, but even if you look at both the application code *and*
> the resource management, there are no clues that the "with"
> statement is really just a masked "try/finally" statement.  just
> look at the generator example:
> 
>     acquire
>     yield
>     release
> 
> what in this snippet tells you that the "release" part will run even if
> the external block raises an exception?

I agree with this point, though. What I liked about the original PEP
340 was the fact that the generator was a template, with "yield"
acting as a "put the block here" placeholder.

> but I still think that something closer to the original PEP 340 is a lot
> more useful.

Overall, I'd agree. I'm still fond of the original PEP 340 in all its
glory - the looping issue was a wart, but PEP 343 seems so stripped
down as to be something entirely different, not just a fix to the
looping issue.

My view - PEP 343 gets a +1 in preference to PEP 310.
I'd like to see PEP 342 - that gets a +1 from me.

Covering both these areas at once, PEP 340 would still probably be my
preference, though. (I'm not convinced there's much chance of
resurrecting it, however.)

Even in its limited form, I prefer PEP 343 to the status quo, though.

Oh, and I'm leaning towards "with" as a keyword again, as a result of
the "works better with member functions" argument.

Paul.

From ncoghlan at gmail.com  Sat May 14 16:20:37 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 15 May 2005 00:20:37 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <79990c6b0505140608277706a7@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>
	<4285D29F.7080100@gmail.com>	<d64ous$v7a$1@sea.gmane.org>
	<79990c6b0505140608277706a7@mail.gmail.com>
Message-ID: <42860935.4080607@gmail.com>

Paul Moore wrote:
> On 5/14/05, Fredrik Lundh <fredrik at pythonware.com> wrote:
> 
>>Nick Coghlan wrote:
>>
>>
>>>PEP 343 (like PEP 310 before it) makes it possible to define the correct
>>>resource management *once*, and then invoke it via a 'with' (or 'do')
>>>statement.
> 
> 
> This is probably the main point for me - encapsulate the try...finally
> dance in such a way that the two parts are not separated by an
> (arbitrarily long) chunk of code. I hated the equivalent dances in C
> (usually malloc/free stuff in that case) and it feels awkward that
> this is the one real ugliness of C that Python hasn't fixed for me :-)
> 
> 
>>sure, but even if you look at both the application code *and*
>>the resource management, there are no clues that the "with"
>>statement is really just a masked "try/finally" statement.  just
>>look at the generator example:
>>
>>    acquire
>>    yield
>>    release
>>
>>what in this snippet tells you that the "release" part will run even if
>>the external block raises an exception?
> 
> 
> I agree with this point, though. What I liked about the original PEP
> 340 was the fact that the generator was a template, with "yield"
> acting as a "put the block here" placeholder.

I also think the "generator as statement template" works much better if the 
__exit__ method is able to inject the exception into the generator frame, rather 
than always calling next().

Maybe PEP 343 should drop any suggestion of using generators to define these 
things, and focus on the PEP 310 style templates.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From gvanrossum at gmail.com  Sat May 14 17:02:25 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Sat, 14 May 2005 08:02:25 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4285B7D5.1090704@gmail.com>
References: <3A81C87DC164034AA4E2DDFE11D258E37720A9@exchange.hqamor.amorhq.net>
	<4285B7D5.1090704@gmail.com>
Message-ID: <ca471dc2050514080260948597@mail.gmail.com>

[Nick Coghlan]
> Also, the call to __enter__() needs to be before the try/finally block (as it is
> in PEP 310). Otherwise we get the "releasing a lock you failed to acquire" problem.

I did that on purpose. There's a separate object ('abc' in the
pseudo-code of the translation) whose __enter__ and __exit__ methods
are called, and in __enter__ it can keep track of the reversible
actions it has taken.

Consider an application where you have to acquire *two* locks regularly:

    def lockBoth():
        got1 = got2 = False
        lock1.acquire(); got1 = True
        lock2.acquire(); got2 = True
        yield None
        if got2: lock2.release()
        if got1: lock1.release()

If this gets interrupted after locking lock1 but before locking lock2,
it still has some cleanup to do.

I know that this complicates simpler use cases, and I'm not 100% sure
this is the right solution; but I don't know how else to handle this
use case.
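For comparison, today's stdlib answer to this multi-resource cleanup problem is `contextlib.ExitStack` (added to Python long after this thread); a sketch of the same two-lock use case:

```python
import threading
from contextlib import ExitStack

lock1 = threading.Lock()
lock2 = threading.Lock()

def run_locked(body):
    # Acquire lock1 then lock2; if acquiring lock2 (or running the body)
    # fails, the stack unwinds and releases whatever was actually acquired.
    with ExitStack() as stack:
        stack.enter_context(lock1)
        stack.enter_context(lock2)
        body()
```

This gives the same "only release what you acquired" guarantee as Guido's `got1`/`got2` flags, without hand-written bookkeeping.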

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Sat May 14 17:12:32 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Sat, 14 May 2005 08:12:32 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <42860935.4080607@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org> <4285D29F.7080100@gmail.com>
	<d64ous$v7a$1@sea.gmane.org>
	<79990c6b0505140608277706a7@mail.gmail.com>
	<42860935.4080607@gmail.com>
Message-ID: <ca471dc2050514081229edefd4@mail.gmail.com>

[Nick Coghlan]
> Maybe PEP 343 should drop any suggestion of using generators to define these
> things, and focus on the PEP 310 style templates.

But then the reason for separating VAR from EXPR becomes unclear.
Several people have mentioned that they thought this was "a good idea
on its own", but without giving additional use cases. Without the
ability to write the acquire/release template as a generator, the big
question is, "why not just PEP 310" ?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Sat May 14 19:01:01 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Sat, 14 May 2005 13:01:01 -0400
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <d64ous$v7a$1@sea.gmane.org>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org> <4285D29F.7080100@gmail.com>
Message-ID: <5.1.1.6.0.20050514124820.021d69a0@mail.telecommunity.com>

At 01:55 PM 5/14/2005 +0200, Fredrik Lundh wrote:
>also, come to think of it, adding a new statement just to hide
>try/finally statements is a waste of statement space.  why not
>just enhance the existing try statement?  let
>
>     try with opening(file) as f:
>         body
>     except IOError:
>         deal with the error (you have to do this anyway)
>
>behave like
>
>     try:
>         f = opening(file)
>         try:
>             try:
>                 body
>             except:
>                 exc = sys.exc_info()
>             else:
>                 exc = None
>         finally:
>             f.__cleanup__(exc)
>     except IOError:
>         deal with the error
>
>and you're done.  (or you could use __enter__ and __exit__ instead,
>which would give you a variation of PEP-343-as-I-understand-it)

I like this, if you take out the "with" part, change the method names to 
__try__ and __finally__, and allow "try" to work as a block on its own if 
you've specified an expression.  i.e.:

     try opening(filename) as f:
         # do stuff

     try locking(self.__lock):
         # do stuff

     try redirecting_stdout(f):
         # something

     try decimal.Context(precision=23):
         # okay, this one's a little weird

     try self.__lock:
         # and so's this; nouns read better with a gerund wrapper

and I'd make the translation be:

     try:
         __exc = ()
         VAR = EXPR.__try__()
         try:
             try:
                 BODY
             except:
                 __exc = sys.exc_info()
                 raise
         finally:
             EXPR.__finally__()

     # except/else/finally clauses here, if there were any in the original try


>but I still think that something closer to the original PEP 340 is a lot
>more useful.

I agree, but mainly because I'd like to be able to allow try/finally around 
"yield" in generators, be able to pass exceptions into generators, and tell 
generators to release their resources.  :)

I do think that the PEP 340 template concept is much more elegant than the 
various PEP 310-derived approaches, though.


From gvanrossum at gmail.com  Sat May 14 19:43:32 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Sat, 14 May 2005 10:43:32 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <d64imp$fai$1@sea.gmane.org>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
Message-ID: <ca471dc205051410435473d2b2@mail.gmail.com>

[Fredrik Lundh]
> intuitively, I'm -1 on this proposal.

So we need to do better. Do you just prefer all of PEP 340? What about
the objections against it? The mostly unnecessary loopiness in
particular?

> unlike the original design, all you get from this is
> the ability to add try/finally blocks to your code
> without ever writing a try/finally-clause (neither
> in your code or in the block controller).  that
> doesn't strike me as especially pythonic.

Would it be better if we pulled back in the generator exit handling
from PEP 340? That's a pretty self-contained thing, and would let you
write try/finally around the yield.
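
A sketch (in modern syntax) of what try/finally around a yield would buy
you, assuming generators gain a close()-style cleanup call as PEP 340
proposes:

```python
def opening_log(log):
    # cleanup is guaranteed even if the consumer abandons the generator
    log.append('open')
    try:
        yield 'handle'
    finally:
        log.append('close')

log = []
g = opening_log(log)
assert next(g) == 'handle'   # runs setup, yields the resource
g.close()                    # injects an exception at the yield; finally runs
print(log)                   # → ['open', 'close']
```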

> (neither does the argument that just because you
> can use a mechanism to write inscrutable code,
> such a mechanism must not be made available
> feel right; Python's design has always been about
> careful tradeoffs between policy and mechanism,
> but this is too much policy for my taste.  the
> original PEP 340 might have been too clever, but
> this reduced version feels pretty pointless).

Maybe. It still solves the majority of use cases for PEP 340, most of
which are try/finally abstractions.

Maybe I'm overreacting to Raymond Chen's rant about flow-control
macros -- but having had to maintain code once that was riddled with
these, it rang very true.

PEP 340 is still my favorite, but it seems there's too much opposition
to it, so I'm trying to explore alternatives; at the same time I
*really* dislike the complexities of some of the non-looping
counterproposals (e.g. Nick Coghlan's PEP 3XX or the proposals that
make every keyword associated with 'try' a method).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From aahz at pythoncraft.com  Sat May 14 20:34:29 2005
From: aahz at pythoncraft.com (Aahz)
Date: Sat, 14 May 2005 11:34:29 -0700
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <428437DE.60204@canterbury.ac.nz>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<4283E9F6.1020500@ocf.berkeley.edu>
	<428437DE.60204@canterbury.ac.nz>
Message-ID: <20050514183429.GA12001@panix.com>

On Fri, May 13, 2005, Greg Ewing wrote:
> Brett C. wrote:
>> 
>> Seems like, especially if we require inheritance from a base
>> exception class in Python 3000, exceptions should have standard 'arg'
>> and 'traceback' attributes with a possible 'context' attribute (or
>> always a 'context' attribute set to None if not a chained exception).
>
> Instead of an 'args' attribute, I'd suggest that the constructor take
> keyword arguments and store them in corresponding attributes. Then
> interested parties could retrieve them by name instead of having to
> remember their positions in the args tuple of the exception class
> concerned.

Sounds reasonable, but it should be equally easy to handle::

    raise MyError, "message"
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"And if that makes me an elitist...I couldn't be happier."  --JMS

From bac at OCF.Berkeley.EDU  Sat May 14 20:57:35 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Sat, 14 May 2005 11:57:35 -0700
Subject: [Python-Dev] Chained Exceptions
In-Reply-To: <ca471dc205051315287d0c2d27@mail.gmail.com>
References: <Pine.LNX.4.58.0505121715360.14555@server1.LFW.org>	
	<ca471dc205051216121093961b@mail.gmail.com>	
	<4283EB2C.7040602@ocf.berkeley.edu>	
	<ca471dc2050512174318ba97a8@mail.gmail.com>	
	<Pine.LNX.4.58.0505121944440.14555@server1.LFW.org>	
	<ca471dc205051218316ca4c611@mail.gmail.com>	
	<42851388.7030407@ocf.berkeley.edu>
	<ca471dc205051315287d0c2d27@mail.gmail.com>
Message-ID: <42864A1F.8010206@ocf.berkeley.edu>

Guido van Rossum wrote:
> [Brett C.]
> 
>>Maybe, but as long as caught exceptions get cleared that shouldn't be an issue.
>>Would this be solved if, when an 'except' branch is exited, exceptions are
>>cleared?  So, in the above example, once the 'pass' is hit in catchit() no
>>exception is considered active any longer.  This could be done with a CLEAR_EXC
>>opcode very easily inserted at the end of an 'except' branch by the compiler.
> 
> 
> Sure, but that would be backwards incompatible.

Right.  None of what I am discussing here I would expect to be implemented any
sooner than Python 3000.

> There's plenty of code
> that expects sys.exc_info() to continue to return the caught exception
> *outside* the except block. This is all incredibly tricky, to some
> extent for backwards compatibility reasons (please read the source
> code for maintaining the exc_info data!).
> 
> In Python 3000, I think we can get rid of sys.exc_info() altogether
> once we place the traceback in the exception object as the 'traceback'
> attribute: if you want this info, all you need is write
> 
>     except SomeException, err:
>         # now type is err.__class__, value is err, and traceback is
> err.traceback.
> 

Right, that is kind of the end goal in my mind.

> If you want to have this with an "except:" clause, you can just catch
> 'Exception' or perhaps 'BaseException'. This isn't possible in Python
> 2.x since there's no single base class.
> 

Right.  Once again I am only thinking about Python 3000.

-Brett

From shane at hathawaymix.org  Sat May 14 21:05:19 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Sat, 14 May 2005 13:05:19 -0600
Subject: [Python-Dev] Python's Unicode width default (New Py_UNICODE doc)
In-Reply-To: <42851A4E.9020609@egenix.com>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>	<427C029D.3090907@v.loewis.de>	<427D0E80.4080502@egenix.com>	<427DD4EF.4030109@v.loewis.de>	<427F9007.3070603@egenix.com>	<427FD6D8.2010003@v.loewis.de>	<428079B5.6010602@egenix.com>	<42810104.3090303@v.loewis.de>	<42848AA1.1020402@egenix.com>
	<42850BC6.4010907@v.loewis.de> <42851A4E.9020609@egenix.com>
Message-ID: <42864BEF.40502@hathawaymix.org>

M.-A. Lemburg wrote:
> It is important to be able to rely on a default that
> is used when no special options are given. The decision
> to use UCS2 or UCS4 is much too important to be
> left to a configure script.

Should the choice be a runtime decision?  I think it should be.  That
could mean two unicode types, a call similar to
sys.setdefaultencoding(), a new unicode extension module, or something else.

BTW, thanks for discussing these issues.  I tried to write a patch to
the unicode API documentation, but it's hard to know just what to write.
 I think I can say this: "sometimes your strings are UTF-16, so you're
working with code units that are not necessarily complete code points;
sometimes your strings are UCS4, so you're working with code units that
are also complete code points.  The choice between UTF-16 and UCS4 is
made at the time the Python interpreter is compiled and the default
choice varies by operating system and configuration."

Shane

From foom at fuhm.net  Sat May 14 21:11:38 2005
From: foom at fuhm.net (James Y Knight)
Date: Sat, 14 May 2005 15:11:38 -0400
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <20050514183429.GA12001@panix.com>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<4283E9F6.1020500@ocf.berkeley.edu>
	<428437DE.60204@canterbury.ac.nz>
	<20050514183429.GA12001@panix.com>
Message-ID: <E0777C58-5EA3-4F98-B7D0-C10330D05CED@fuhm.net>

On May 14, 2005, at 2:34 PM, Aahz wrote:
> On Fri, May 13, 2005, Greg Ewing wrote:
>
> Sounds reasonable, but it should be equally easy to handle::
>
>     raise MyError, "message"

Make that:
    raise MyError("message")

There's really no reason for a multi-argument raise when exceptions  
are objects anyhow.
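
Combined with Greg's earlier suggestion of keyword arguments stored as
attributes, a base class along these (purely hypothetical) lines would
cover both points:

```python
class AttrError(Exception):
    # hypothetical sketch: store keyword arguments as attributes,
    # as suggested earlier in the thread
    def __init__(self, message='', **kwargs):
        Exception.__init__(self, message)
        self.message = message
        for name, value in kwargs.items():
            setattr(self, name, value)

try:
    raise AttrError("no such file", filename="/etc/passwd", errno=2)
except AttrError as err:
    # interested parties retrieve the details by name, not position
    caught = (err.filename, err.errno)

print(caught)  # → ('/etc/passwd', 2)
```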

James

From bac at OCF.Berkeley.EDU  Sat May 14 21:31:42 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Sat, 14 May 2005 12:31:42 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051410435473d2b2@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
Message-ID: <4286521E.5010703@ocf.berkeley.edu>

Guido van Rossum wrote:
> [Fredrik Lundh]
> 
>>intuitively, I'm -1 on this proposal.
> 

Just to toss in my opinion, I prefer PEP 340 over 343 as well, but not so much
to give 343 a -1 from me.

[SNIP - question about the argument against PEP 340 being a loop, which I
never totally got, since you know it ahead of time and just deal with it]

[SNIP]
> Maybe I'm overreacting to Raymond Chen's rant about flow-control
> macros -- but having had to maintain code once that was riddled with
> these, it rang very true.
> 

You might be overreacting.  I read the essay and I understand his arguments
having been taught how to program in Scheme.  But I don't think this is as
serious.  People will have the semantics of the block statement explained to
them so how that works will be clear.  And at that point they just have to
track down the generator or iterator that the block statement is using.

If you think about it, how is it different from the implicit iter() call in a
'for' loop, along with the implicit next() call each time through the loop?
Just because there is an implicit closing call back into the block generator at
the end of a block statement?  That doesn't seem so bad to me, or that much of
a stretch from the magic of a 'for' loop.

I think Raymond was reeling against arbitrary macro creation that hides flow
control.  We don't have that here.  What we have is a clearly defined statement
that does some very handy syntactic sugar for us.  It doesn't feel as arbitrary
as what Lisp and Scheme allow you to do.

> PEP 340 is still my favorite, but it seems there's too much opposition
> to it, so I'm trying to explore alternatives; at the same time I
> *really* dislike the complexities of some of the non-looping
> counterproposals (e.g. Nick Coghlan's PEP 3XX or the proposals that
> make every keyword associated with 'try' a method).
> 

Nick's was obviously directly against looping, but, with no offense to Nick,
how many other people were against it looping?  It never felt like it was a
screaming mass with pitchforks but more of a "I don't love it, but I can deal"
crowd.

And as for the overly complex examples, I believe those stemmed from people
realizing what might be possible if the statement was extended and tweaked this
way or that so as to be able to do just one more thing.  But that happens with
every proposal; it seemed like standard feature creep.  The reason we have a
BDFL is to tell us that yes, we could get the jumbo-sized candy bar for $2
more, but there is no way you would be able to finish that much chocolate
before it melted all over your hands and got all over your nice PyCon t-shirt.

But then again I don't know if you got private emails asking to see if PEP 340
weighed as much as wood so it could be burned at the stake for being a witch.

-Brett

From bob at redivi.com  Sat May 14 21:39:17 2005
From: bob at redivi.com (Bob Ippolito)
Date: Sat, 14 May 2005 15:39:17 -0400
Subject: [Python-Dev] Python's Unicode width default (New Py_UNICODE doc)
In-Reply-To: <42864BEF.40502@hathawaymix.org>
References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com>	<2mhdhiyler.fsf@starship.python.net>	<19b53d82c6eb11419ddf4cb529241f64@opnet.com>	<427946B9.6070500@v.loewis.de>	<42794ABD.2080405@hathawaymix.org>	<94bf7ebbb72e749216d46a4fa109ffa2@opnet.com>	<427B1A06.4010004@egenix.com>	<1aa9dc22bc477128c9dfbbc8d0f1f3a5@opnet.com>	<427BCD3B.1000201@egenix.com>	<427C029D.3090907@v.loewis.de>	<427D0E80.4080502@egenix.com>	<427DD4EF.4030109@v.loewis.de>	<427F9007.3070603@egenix.com>	<427FD6D8.2010003@v.loewis.de>	<428079B5.6010602@egenix.com>	<42810104.3090303@v.loewis.de>	<42848AA1.1020402@egenix.com>
	<42850BC6.4010907@v.loewis.de> <42851A4E.9020609@egenix.com>
	<42864BEF.40502@hathawaymix.org>
Message-ID: <B8DF49B3-9B78-4101-874E-9CDFEA5E280A@redivi.com>


On May 14, 2005, at 3:05 PM, Shane Hathaway wrote:

> M.-A. Lemburg wrote:
>
>> It is important to be able to rely on a default that
>> is used when no special options are given. The decision
>> to use UCS2 or UCS4 is much too important to be
>> left to a configure script.
>>
>
> Should the choice be a runtime decision?  I think it should be.  That
> could mean two unicode types, a call similar to
> sys.setdefaultencoding(), a new unicode extension module, or  
> something else.
>
> BTW, thanks for discussing these issues.  I tried to write a patch to
> the unicode API documentation, but it's hard to know just what to  
> write.
>  I think I can say this: "sometimes your strings are UTF-16, so you're
> working with code units that are not necessarily complete code points;
> sometimes your strings are UCS4, so you're working with code units  
> that
> are also complete code points.  The choice between UTF-16 and UCS4 is
> made at the time the Python interpreter is compiled and the default
> choice varies by operating system and configuration."

Well, if you're going to make it runtime, you might as well do it  
right.  Take away the restriction that the unicode type backing store  
is forced to be a particular encoding (i.e. get rid of  
PyUnicode_AS_UNICODE) and give it more flexibility.

The implementation of NSString in OpenDarwin's libFoundation <http:// 
libfoundation.opendarwin.org/> (BSD license), or the CFString  
implementation in Apple's CoreFoundation <http://developer.apple.com/ 
darwin/cflite.html> (APSL) would be an excellent place to look for  
how this can be done.

Of course, for backwards compatibility reasons, this would have to be  
a new type that descends from basestring.  text would probably be a  
good name for it.  This would be an abstract implementation, where  
you can make concrete subclasses that actually implement the various  
operations as necessary.  For example, you could have text_ucs2,  
text_ucs4, text_ascii, text_codec, etc.

The bonus here is you can get people to shut up about space efficient  
representations, because you can use whatever makes sense.
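
A very rough sketch of the shape being described (all names hypothetical;
an abstract base with concrete per-encoding subclasses):

```python
from abc import ABC, abstractmethod

class text(ABC):
    """Hypothetical abstract string type; concrete subclasses pick
    their own backing store."""
    @abstractmethod
    def code_points(self):
        """Return the sequence of Unicode code points."""

class text_ascii(text):
    # one byte per character, so every code unit is a code point
    def __init__(self, data):
        self.data = bytes(data)
    def code_points(self):
        return list(self.data)

class text_ucs4(text):
    # four bytes per character, also one code point per unit
    def __init__(self, points):
        self.points = list(points)
    def code_points(self):
        return self.points

print(text_ascii(b"abc").code_points())  # → [97, 98, 99]
```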

-bob


From shane at hathawaymix.org  Sun May 15 01:10:20 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Sat, 14 May 2005 17:10:20 -0600
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc2050514080260948597@mail.gmail.com>
References: <3A81C87DC164034AA4E2DDFE11D258E37720A9@exchange.hqamor.amorhq.net>	<4285B7D5.1090704@gmail.com>
	<ca471dc2050514080260948597@mail.gmail.com>
Message-ID: <4286855C.6020103@hathawaymix.org>

Guido van Rossum wrote:
> [Nick Coghlan]
> 
>>Also, the call to __enter__() needs to be before the try/finally block (as it is
>>in PEP 310). Otherwise we get the "releasing a lock you failed to acquire" problem.
> 
> 
> I did that on purpose. There's a separate object ('abc' in the
> pseudo-code of the translation) whose __enter__ and __exit__ methods
> are called, and in __enter__ it can keep track of the reversible
> actions it has taken.
> 
> Consider an application where you have to acquire *two* locks regularly:
> 
>     def lockBoth():
>         got1 = got2 = False
>         lock1.acquire(); got1 = True
>         lock2.acquire(); got2 = True
>         yield None
>         if got2: lock2.release()
>         if got1: lock1.release()
> 
> If this gets interrupted after locking lock1 but before locking lock2,
> it still has some cleanup to do.

That code is incorrect, though.  Say lockBoth() acquires lock1 but then
lock2.acquire() throws an exception.  (Maybe the lock requires some I/O
operation, and the operation fails.)  The interpreter will never reach
the yield statement and lock1 will never be released.

You really have to write it like this:

    def lockBoth():
        lock1.acquire()
        try:
            lock2.acquire()
        except:
            lock1.release()
            raise
        yield None
        try:
            lock2.release()
        finally:
            lock1.release()

> I know that this complicates simpler use cases, and I'm not 100% sure
> this is the right solution; but I don't know how else to handle this
> use case.

If __enter__ raises an exception, it has to clean up after itself before
propagating the exception.  __exit__ shouldn't be called if __enter__ fails.

Shane

From shane at hathawaymix.org  Sun May 15 02:07:02 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Sat, 14 May 2005 18:07:02 -0600
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4286521E.5010703@ocf.berkeley.edu>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286521E.5010703@ocf.berkeley.edu>
Message-ID: <428692A6.8080201@hathawaymix.org>

Brett C. wrote:
> Guido van Rossum wrote:
>>PEP 340 is still my favorite, but it seems there's too much opposition
>>to it, so I'm trying to explore alternatives; at the same time I
>>*really* dislike the complexities of some of the non-looping
>>counterproposals (e.g. Nick Coghlan's PEP 3XX or the proposals that
>>make every keyword associated with 'try' a method).
>>
> 
> 
> Nick's was obviously directly against looping, but, with no offense to Nick,
> how many other people were against it looping?  It never felt like it was a
> screaming mass with pitchforks but more of a "I don't love it, but I can deal"
> crowd.

PEP 340 is very nice, but it became less appealing to me when I saw what
it would do to "break" and "continue" statements.

    text = 'diamond'
    for fn in filenames:
        opening(fn) as f:
            if text in f.read():
                print 'I found the text in %s' % fn
                break

I think it would be pretty surprising if the break didn't stop the loop.

Here's a new suggestion for PEP 340: use one keyword to start a block
you don't want to loop, and a different keyword to start a block that
can loop.  If you specify the non-looping keyword but the block template
produces more than one result, a RuntimeError results.  Here is example
A, a non-looping block statement using "try":

    text = 'diamond'
    for fn in filenames:
        try opening(fn) as f:
            if text in f.read():
                print 'I found the text in %s' % fn
                break

In example A, the break statement breaks the "for" loop.  If the
opening() iterator returns more than one result, a RuntimeError will be
generated by the Python interpreter.

Here is example B, a looping block statement using "in", adapted from
PEP 340:

    in auto_retry(3, IOError) as attempt:
        f = urllib.urlopen("http://python.org/peps/pep-0340.html")
        print f.read()

Note that I introduced no new keywords except "as", and the syntax in
both cases is currently illegal.

Shane

From ncoghlan at gmail.com  Sun May 15 03:12:41 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 15 May 2005 11:12:41 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc2050514080260948597@mail.gmail.com>
References: <3A81C87DC164034AA4E2DDFE11D258E37720A9@exchange.hqamor.amorhq.net>	
	<4285B7D5.1090704@gmail.com>
	<ca471dc2050514080260948597@mail.gmail.com>
Message-ID: <4286A209.5010307@gmail.com>

Guido van Rossum wrote:
> [Nick Coghlan]
> 
>>Also, the call to __enter__() needs to be before the try/finally block (as it is
>>in PEP 310). Otherwise we get the "releasing a lock you failed to acquire" problem.
> 
> 
> I did that on purpose. There's a separate object ('abc' in the
> pseudo-code of the translation) whose __enter__ and __exit__ methods
> are called, and in __enter__ it can keep track of the reversible
> actions it has taken.
> 
> Consider an application where you have to acquire *two* locks regularly:
> 
>     def lockBoth():
>         got1 = got2 = False
>         lock1.acquire(); got1 = True
>         lock2.acquire(); got2 = True
>         yield None
>         if got2: lock2.release()
>         if got1: lock1.release()
> 
> If this gets interrupted after locking lock1 but before locking lock2,
> it still has some cleanup to do.
> 
> I know that this complicates simpler use cases, and I'm not 100% sure
> this is the right solution; but I don't know how else to handle this
> use case.
> 

If we retained the ability to inject exceptions into generators, this would be 
written with the extremely natural:

   @with_template:
   def lockboth():
     lock1.acquire()
     try:
         lock2.acquire()
         try:
             yield
         finally:
             lock2.release()
     finally:
         lock1.release()

Or, even more simply:

    @with_template:
    def lockboth():
        with lock1:
            with lock2:
                yield

I think Fredrik's intuition is on to something - PEP 343 has scaled the idea 
back *too* far by throwing away the injection of exceptions into generator 
templates, when the only major objection was to the looping nature of the proposal.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From python at rcn.com  Sun May 15 04:05:46 2005
From: python at rcn.com (Raymond Hettinger)
Date: Sat, 14 May 2005 22:05:46 -0400
Subject: [Python-Dev] [Python-checkins] python/nondist/peps pep-0343.txt,
	1.8, 1.9
In-Reply-To: <E1DX7GK-0004Nt-Pi@sc8-pr-cvs1.sourceforge.net>
Message-ID: <003301c558f2$9c0aa380$f5b69d8d@oemcomputer>

> -    was, under the covers, a (optential) looping construct.  This
> +    was, under the covers, a (potential) looping construct.  This

I'm glad I didn't fix this one.
I thought he meant to use "optional".


Raymond Hettinger

From python-dev at zesty.ca  Sun May 15 06:58:12 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Sat, 14 May 2005 23:58:12 -0500 (CDT)
Subject: [Python-Dev] PEP 343: Resource Composition and Idempotent __exit__
In-Reply-To: <4286855C.6020103@hathawaymix.org>
References: <3A81C87DC164034AA4E2DDFE11D258E37720A9@exchange.hqamor.amorhq.net>
	<4285B7D5.1090704@gmail.com>
	<ca471dc2050514080260948597@mail.gmail.com>
	<4286855C.6020103@hathawaymix.org>
Message-ID: <Pine.LNX.4.58.0505141826570.14555@server1.LFW.org>

Yikes, this turned out to be rather long.  Here's a summary:

  - The expansion suggested by Shane (moving __enter__ outside the
    try-finally) fits the expectation that __exit__ will always be
    paired with a successful __enter__.  This "paired-exit" expansion
    is fairly intuitive and makes simple resources easy to write,
    but they are not very composable.

  - The expansion that Guido currently has in PEP 343 encourages an
    implementation style where __exit__ is idempotent.  If we use it,
    we should document this fact, since it may seem a little unusual
    at first; we should also rename "enter"/"exit" so they do not
    mislead programmers into believing that they are paired.  Simple
    resources are a little more work to write than with a paired-exit
    expansion, but they are easier to compose and reuse.

  - The generator style in PEP 340 is the easiest to compose and
    reuse, but its implementation is the most complex to understand.

I lean (but only slightly) toward the second option, because it seems
to be a reasonable compromise.  The increased complexity of writing
resources for PEP 343 over PEP 340 becomes less of an issue if we have
a good do_template function in the standard library; on the other hand,
the use of do_template may complicate debugging.

For idempotent-exit, possible renamings of enter/exit might be
enter/finish, enter/cleanup, enter/finally, start/finally, begin/finally.

                            *       *       *

Okay.  Here's how i arrived at the above conclusions.  (In the following,
i'll just use "with" for "do/with".)

PEP 343 (rev 1.8) currently expands

    with EXPR as VAR:
        BLOCK

to this, which i'll call the "idempotent-exit" expansion:

    resource = EXPR
    exc = (None, None, None)
    try:
        try:
            VAR = resource.__enter__()
            BLOCK
        except:
            exc = sys.exc_info()
            raise
    finally:
        resource.__exit__(*exc)

If there are problems during __enter__, then __enter__ is expected to
record this fact so that __exit__ can clean up.  Since __exit__ is
called regardless of whether __enter__ succeeded, this encourages a
style of writing resources where __exit__ is idempotent.

An alternative, advocated by Shane (and by my first instincts), is this
expansion, which i'll call the "paired-exit" expansion:

    resource = EXPR
    exc = (None, None, None)
    VAR = resource.__enter__()
    try:
        try:
            BLOCK
        except:
            exc = sys.exc_info()
            raise
    finally:
        resource.__exit__(*exc)

If there are problems during __enter__, __enter__ must clean them up
before propagating an exception, because __exit__ will not be called.

To evaluate these options, we could look at a few scenarios where we're
trying to write a resource wrapper for some lock objects.  Each lock
object has two methods, .acquire() and .release().

    Scenario 1. You have two resource objects and you want to acquire both.

    Scenario 2. You want a single resource object that acquires two locks.

    Scenario 3. Your resource object acquires one of two locks depending
        on some runtime condition.


Scenario 1 (Composition by client)
==================================

The client writes this:

    with resource1:
        with resource2:
            BLOCK

The idempotent-exit expansion would yield this:

    exc1 = (None, None, None)
    try:
        try:
            resource1.__enter__()
            exc2 = (None, None, None)
            try:
                try:
                    resource2.__enter__()
                    BLOCK
                except:
                    exc2 = sys.exc_info()
                    raise
            finally:
                resource2.__exit__(*exc2)
        except:
            exc1 = sys.exc_info()
            raise
    finally:
        resource1.__exit__(*exc1)

Because __exit__ is always called even if __enter__ fails, the resource
wrapper must record whether __enter__ succeeded:

    class ResourceI:
        def __init__(self, lock):
            self.lock = lock
            self.acquired = False

        def __enter__(self):
            self.lock.acquire()
            self.acquired = True

        def __exit__(self, *exc):
            if self.acquired:
                self.lock.release()
                self.acquired = False

The paired-exit expansion would yield this:

    exc1 = (None, None, None)
    resource1.__enter__()
    try:
        try:
            exc2 = (None, None, None)
            resource2.__enter__()
            try:
                try:
                    BLOCK
                except:
                    exc2 = sys.exc_info()
                    raise
            finally:
                resource2.__exit__(*exc2)
        except:
            exc1 = sys.exc_info()
            raise
    finally:
        resource1.__exit__(*exc1)

In this case the lock can simply be implemented as:

    class ResourceP:
        def __init__(self, lock):
            self.lock = lock

        def __enter__(self):
            self.lock.acquire()

        def __exit__(self, *exc):
            self.lock.release()

With PEP 340, assuming no return values and the presence of an __exit__
method, we would get this expansion:

    exc1 = None
    while True:
        try:
            if exc1:
                resource1.__exit__(*exc1)         # may re-raise *exc1
            else:
                resource1.next()                  # may raise StopIteration
        except StopIteration:
            break
        try:
            exc1 = None

            exc2 = None
            while True:
                try:
                    if exc2:
                        resource2.__exit__(*exc2) # may re-raise *exc2
                    else:
                        resource2.next()          # may raise StopIteration
                except StopIteration:
                    break
                try:
                    exc2 = None
                    BLOCK
                except:
                    exc2 = sys.exc_info()

        except:
            exc1 = sys.exc_info()

Assuming that the implementations of resource1 and resource2 invoke
'yield' exactly once, this reduces to:

    exc1 = None
    resource1.next()               # first time, will not raise StopIteration
    try:
        exc2 = None
        resource2.next()           # first time, will not raise StopIteration
        try:
            BLOCK
        except:
            exc2 = sys.exc_info()
        try:
            if exc2:
                resource2.__exit__(*exc2)
            else:
                resource2.next()   # second time, will raise StopIteration
        except StopIteration:
            pass
    except:
        exc1 = sys.exc_info()
    try:
        if exc1:
            resource1.__exit__(*exc1)
        else:
            resource1.next()       # second time, will raise StopIteration
    except StopIteration:
        pass

For this expansion, it is sufficient to implement the resource as:

    def ResourceG(lock):
        lock.acquire()
        try:
            yield
        finally:
            lock.release()


Scenario 2 (Composition by implementor)
=======================================

The client writes this:

    with DoubleResource(lock1, lock2):
        BLOCK

With the idempotent-exit expansion, we could implement DoubleResource
directly like this:

    class DoubleResourceI:
        def __init__(self, lock1, lock2):
            self.lock1, self.lock2 = lock1, lock2
            self.got1 = self.got2 = False

        def __enter__(self):
            self.lock1.acquire()
            self.got1 = True
            try:
                self.lock2.acquire()
                self.got2 = True
            except:
                self.lock1.release()
                self.got1 = False
                raise

        def __exit__(self, *exc):
            try:
                if self.got2:
                    self.lock2.release()
                    self.got2 = False
            finally:
                if self.got1:
                    self.lock1.release()
                    self.got1 = False

or it could be implemented in terms of ResourceI like this:

    class DoubleResourceI:
        def __init__(self, lock1, lock2):
            self.resource1 = ResourceI(lock1)
            self.resource2 = ResourceI(lock2)

        def __enter__(self):
            self.resource1.__enter__()
            self.resource2.__enter__()

        def __exit__(self, *exc):
            try:
                self.resource2.__exit__()
            finally:
                self.resource1.__exit__()

On the other hand, if we use the paired-exit expansion, the
DoubleResource would be implemented like this:

    class DoubleResourceP:
        def __init__(self, lock1, lock2):
            self.lock1, self.lock2 = lock1, lock2

        def __enter__(self):
            self.lock1.acquire()
            try:
                self.lock2.acquire()
            except:
                self.lock1.release()
                raise

        def __exit__(self):
            try:
                self.lock2.release()
            finally:
                self.lock1.release()

As far as i can tell, the implementation of DoubleResourceP is
made no simpler by the definition of ResourceP.

With PEP 340, the DoubleResource could be written directly like this:

    def DoubleResourceG(lock1, lock2):
        lock1.acquire()
        try:
            lock2.acquire()
        except:
            lock1.release()
            raise
        try:
            yield
        finally:
            try:
                lock2.release()
            finally:
                lock1.release()
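A quick check of the error path in the generator above (the FakeLock and
FailingLock helpers are invented for illustration): if the second acquire
fails, the first lock is released and the exception propagates.

```python
class FakeLock:
    """Stand-in lock that records acquire/release calls."""
    def __init__(self, name, log):
        self.name, self.log = name, log
    def acquire(self):
        self.log.append(('acquire', self.name))
    def release(self):
        self.log.append(('release', self.name))

class FailingLock(FakeLock):
    """Stand-in lock whose acquire always fails."""
    def acquire(self):
        raise RuntimeError('cannot acquire %s' % self.name)

def DoubleResourceG(lock1, lock2):
    lock1.acquire()
    try:
        lock2.acquire()
    except:
        lock1.release()
        raise
    try:
        yield
    finally:
        try:
            lock2.release()
        finally:
            lock1.release()

log = []
gen = DoubleResourceG(FakeLock('1', log), FailingLock('2', log))
try:
    next(gen)
except RuntimeError:
    pass
else:
    raise AssertionError('expected the acquire failure to propagate')
assert log == [('acquire', '1'), ('release', '1')]
```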

Or, if ResourceG were already defined, it could simply be written:

    def DoubleResourceG(lock1, lock2):
        with ResourceG(lock1):
            with ResourceG(lock2):
                yield

This should also work with PEP 343 if decorated with "@do_template",
though i don't have the patience to verify that carefully.  When
written this way, the Boolean flags disappear, as their purpose is
replaced by the internal generator state.


Scenario 3 (Conditional acquisition)
====================================

The client writes this:

    with ConditionalResource(condition, lock1, lock2):
        BLOCK

For the idempotent-exit expansion, we could implement ConditionalResource
directly like this:

    class ConditionalResourceI:
        def __init__(self, condition, lock1, lock2):
            self.condition = condition
            self.lock1, self.lock2 = lock1, lock2
            self.got1 = self.got2 = False

        def __enter__(self):
            if self.condition():
                self.lock1.acquire()
                self.got1 = True
            else:
                self.lock2.acquire()
                self.got2 = True

        def __exit__(self, *exc):
            try:
                if self.got1:
                    self.lock1.release()
                    self.got1 = False
            finally:
                if self.got2:
                    self.lock2.release()
                    self.got2 = False

Or we could implement it more simply in terms of ResourceI like this:

    class ConditionalResourceI:
        def __init__(self, condition, lock1, lock2):
            self.condition = condition
            self.resource1 = ResourceI(lock1)
            self.resource2 = ResourceI(lock2)

        def __enter__(self):
            if self.condition():
                self.resource1.__enter__()
            else:
                self.resource2.__enter__()

        def __exit__(self, *exc):
            try:
                self.resource2.__exit__()
            finally:
                self.resource1.__exit__()

For the paired-exit expansion, we would implement ConditionalResource
directly like this:

    class ConditionalResourceP:
        def __init__(self, condition, lock1, lock2):
            self.condition = condition
            self.lock1, self.lock2 = lock1, lock2
            self.flag = None

        def __enter__(self):
            self.flag = self.condition()
            if self.flag:
                self.lock1.acquire()
            else:
                self.lock2.acquire()

        def __exit__(self, *exc):
            if self.flag:
                self.lock1.release()
            else:
                self.lock2.release()
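The paired conditional version above can be checked directly (the recording
stand-in locks are invented for illustration): only the lock selected by the
condition is ever touched.

```python
class FakeLock:
    """Stand-in lock that records acquire/release calls."""
    def __init__(self, name, log):
        self.name, self.log = name, log
    def acquire(self):
        self.log.append(('acquire', self.name))
    def release(self):
        self.log.append(('release', self.name))

class ConditionalResourceP:
    def __init__(self, condition, lock1, lock2):
        self.condition = condition
        self.lock1, self.lock2 = lock1, lock2
        self.flag = None

    def __enter__(self):
        self.flag = self.condition()
        if self.flag:
            self.lock1.acquire()
        else:
            self.lock2.acquire()

    def __exit__(self, *exc):
        if self.flag:
            self.lock1.release()
        else:
            self.lock2.release()

log = []
r = ConditionalResourceP(lambda: False, FakeLock('1', log), FakeLock('2', log))
r.__enter__()
r.__exit__()
assert log == [('acquire', '2'), ('release', '2')]  # only lock2 was touched
```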

And using PEP 340, we would write it as a generator like this:

    def ConditionalResourceG(condition, lock1, lock2):
        if condition():
            with ResourceG(lock1):
                yield
        else:
            with ResourceG(lock2):
                yield

Again, i would expect this to also work with PEP 343 if "@do_template"
were inserted in front.


-- ?!ng

From ncoghlan at gmail.com  Sun May 15 09:22:51 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 15 May 2005 17:22:51 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051410435473d2b2@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
Message-ID: <4286F8CB.4090001@gmail.com>

Guido van Rossum wrote:
> [Fredrik Lundh]
> 
>>unlike the original design, all you get from this is
>>the ability to add try/finally blocks to your code
>>without ever writing a try/finally-clause (neither
>>in your code or in the block controller).  that
>>doesn't strike me as especially pythonic.
> 
> Would it be better if we pulled back in the generator exit handling
> from PEP 340? That's a pretty self-contained thing, and would let you
> write try/finally around the yield.

That would be good, in my opinion. I updated PEP 3XX to use this idea:
http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

With that update (to version 1.6), PEP 3XX is basically PEP 343, but injecting 
exceptions that occur into the template generator's internal frame instead of 
invoking next().

The rest of the PEP is then about dealing with the implications of allowing 
yield inside try/finally statements.

The Rejected Options section tries to look at all the alternatives brought up in 
the various PEP 310, 340 and 343 discussions, and explain why PEP 3XX chooses 
the way it does.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From sc at eyemagnet.net  Sun May 15 09:56:24 2005
From: sc at eyemagnet.net (Steve Castellotti)
Date: Sun, 15 May 2005 19:56:24 +1200
Subject: [Python-Dev] Loading compiled modules under MSYS/MingGW?
Message-ID: <20050515195624.m1bkd9x3lwc84gkc@www.eyemagnet.net>

Hey all-

   Simple question. I'm working on getting Python support enabled for the Gimp
under Win32. I've set up and successfully compiled the Gimp under an MSYS/MinGW
environment, using both python.org Python 2.3 and ActivePython 2.3 (I need to
stick to 2.3 due to support issues with the packaging software I'm using).

   Both versions of Python give me the same problem when I try to load the
"gimpfu" module (used for accessing Gimp functions from within Python programs)


Sample Script:

import sys
sys.path.insert(1, '/local/lib/gimp/2.0/python/')
import gimpfu
print "cool"



Execution:

$ ./test
Traceback (most recent call last):
  File "./test", line 7, in ?
    import gimpfu
  File "E:\msys\1.0\local\lib\gimp\2.0\python\gimpfu.py", line 65, in ?
    import gimp
ImportError: No module named gimp



Directory listing:

$ ls /local/lib/gimp/2.0/python/
gimpenums.py  gimpmodule.a   gimpprocbrowsermodule.a   gimpui.py
gimpfu.py     gimpmodule.la  gimpprocbrowsermodule.la  pygimp-logo.png
gimpfu.pyc    gimpplugin.py  gimpshelf.py



   ...Python is finding gimpfu.py, but doesn't see "gimpmodule.[a,la]". This
directory listing is similar to what you would find under a Linux compilation,
except that Linux also creates a "gimpmodule.so" library file.


    Am I missing something obvious? Is this a question better suited to
MinGW/MSYS mailing lists, or perhaps the Gimp list? If so, what's the right
kind of question(s) I need to be asking? It seems to me that the compile is
going through just fine, but something's not getting built correctly and Python
can't see that module.


Cheers in advance,

Steve Castellotti


From shane at hathawaymix.org  Sun May 15 10:42:46 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Sun, 15 May 2005 02:42:46 -0600
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4286F8CB.4090001@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
Message-ID: <42870B86.9070605@hathawaymix.org>

Nick Coghlan wrote:
> That would be good, in my opinion. I updated PEP 3XX to use this idea:
> http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
> 
> With that update (to version 1.6), PEP 3XX is basically PEP 343, but injecting 
> exceptions that occur into the template generator's internal frame instead of 
> invoking next().
> 
> The rest of the PEP is then about dealing with the implications of allowing 
> yield inside try/finally statements.

You might add to the PEP the following example, which could really
improve the process of building GUIs in Python:

    class MyFrame(Frame):
        def __init__(self):
            with Panel():
                with VerticalBoxSizer():
                    self.text = TextEntry()
                    self.ok = Button('Ok')

Indentation improves the readability of code that creates a hierarchy.

Shane

From eric.nieuwland at xs4all.nl  Sun May 15 11:12:58 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Sun, 15 May 2005 11:12:58 +0200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <428692A6.8080201@hathawaymix.org>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286521E.5010703@ocf.berkeley.edu>
	<428692A6.8080201@hathawaymix.org>
Message-ID: <39ba574ece1a11f66038ac1048217916@xs4all.nl>

Shane Hathaway wrote:
> Here is example A, a non-looping block statement using "try":
>
>     text = 'diamond'
>     for fn in filenames:
>         try opening(fn) as f:
>             if text in f.read():
>                 print 'I found the text in %s' % fn
>                 break

That's a pretty way to write it!
Would it be possible to extend the 'try' syntax in this way?
It would certainly stress the fact that this construct includes 
exception handling.

--eric


From ncoghlan at gmail.com  Sun May 15 13:39:51 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 15 May 2005 21:39:51 +1000
Subject: [Python-Dev] PEP 343: Resource Composition and Idempotent
	__exit__
In-Reply-To: <Pine.LNX.4.58.0505141826570.14555@server1.LFW.org>
References: <3A81C87DC164034AA4E2DDFE11D258E37720A9@exchange.hqamor.amorhq.net>
	<4285B7D5.1090704@gmail.com>
	<ca471dc2050514080260948597@mail.gmail.com>
	<4286855C.6020103@hathawaymix.org>
	<Pine.LNX.4.58.0505141826570.14555@server1.LFW.org>
Message-ID: <42873507.8050600@gmail.com>

Ka-Ping Yee wrote:
>   - The generator style in PEP 340 is the easiest to compose and
>     reuse, but its implementation is the most complex to understand.

The latest version of my PEP 3XX aims to get (most of) the power of PEP 340, 
with the easy comprehensibility of PEP 310. What magic it requires is almost 
entirely contained in the statement_template decorator.

It can be looked at as PEP 340 without the looping or ability to suppress 
exceptions, or as PEP 343, with PEP 340's style of using generators to write 
templates.

It falls into the category where __enter__ and __exit__ are paired, as it uses 
the same expansion as Shane describes (an exception in __enter__ means that 
__exit__ is never executed).


> To evaluate these options, we could look at a few scenarios where we're
> trying to write a resource wrapper for some lock objects.  Each lock
> object has two methods, .acquire() and .release().
> 
>     Scenario 1. You have two resource objects and you want to acquire both.
> 
>     Scenario 2. You want a single resource object that acquires two locks.
> 
>     Scenario 3. Your resource object acquires one of two locks depending
>         on some runtime condition.

For your three scenarios, PEP 3XX usage and implementation are as you describe 
for PEP 343. PEP 343 itself doesn't work as you describe, as it still prohibits 
yielding inside a try/finally block (and, by extension, inside a with statement, 
as that is just syntactic sugar for a particular type of try/finally).

Scenario 1 (two locks, handled manually):

PEP 3XX actually recommends supplying __enter__ and __exit__ directly on lock 
objects, so no additional 'resource' wrapper is required:

     with lock1:
         with lock2:
             BLOCK

And the relevant lock methods are:

     def __enter__(self):
         self.acquire()

     def __exit__(self, *exc):
         self.release()
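As a side note, this is easy to try in a current CPython, where lock objects
provide exactly these methods (a small sketch using the standard library's
threading.Lock, independent of the thread's proposals):

```python
import threading

lock = threading.Lock()
with lock:                 # __enter__ acquires the lock
    assert lock.locked()
assert not lock.locked()   # __exit__ released it on the way out
```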

However, if that didn't happen, and an external wrapper was needed, it could be 
optimally implemented as:

     class Resource(object):
         def __init__(self, lock):
             self.lock = lock

         def __enter__(self):
             self.lock.acquire()

         def __exit__(self, *exc):
             self.lock.release()

Or less efficiently as:

     @statement_template
     def Resource(lock):
         lock.acquire()
         try:
             yield
         finally:
             lock.release()

Scenario 2 (two locks, handled by resource):

Used as:

     with DoubleResource(lock1, lock2):
         BLOCK


Implemented as:

     @statement_template
     def DoubleResource(resource1, resource2):
         with resource1:
             with resource2:
                 yield

Scenario 3 (runtime choice of lock):

Used as:

     with ConditionalResource(condition, lock1, lock2):
         BLOCK

Implemented as:

     @statement_template
     def ConditionalResource(condition, resource1, resource2):
         if condition:
             with resource1:
                 yield
         else:
             with resource2:
                 yield
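The statement_template decorator itself is not shown in the thread; a minimal
present-day reconstruction might look like the following (the names and
details are my guesses, and the real PEP 3XX version differs, e.g. in exactly
how it injects exceptions into the template generator's frame):

```python
class _TemplateContext:
    """Pairs a template generator with __enter__/__exit__."""
    def __init__(self, gen):
        self.gen = gen

    def __enter__(self):
        # Run the template up to its yield; its value becomes VAR.
        return next(self.gen)

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:
            try:
                next(self.gen)          # resume; the template should finish
            except StopIteration:
                pass
            return False
        try:
            self.gen.throw(exc)         # inject into the generator's frame
        except StopIteration:
            return False
        except BaseException as err:
            if err is exc:
                return False            # template re-raised the original
            raise                       # template raised something new
        return False

def statement_template(genfunc):
    def wrapper(*args, **kwds):
        return _TemplateContext(genfunc(*args, **kwds))
    return wrapper

@statement_template
def Resource(lock):
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

class FakeLock:
    """Stand-in lock tracking whether it is held."""
    def __init__(self):
        self.held = False
    def acquire(self):
        self.held = True
    def release(self):
        self.held = False

lock = FakeLock()
with Resource(lock):
    assert lock.held
assert not lock.held

try:
    with Resource(lock):
        raise ValueError('boom')
except ValueError:
    pass
assert not lock.held   # the finally clause ran on the error path too
```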

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From andersjm at inbound.dk  Sun May 15 13:42:00 2005
From: andersjm at inbound.dk (Anders J. Munch)
Date: Sun, 15 May 2005 13:42:00 +0200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
References: <3A81C87DC164034AA4E2DDFE11D258E37720A9@exchange.hqamor.amorhq.net>	<4285B7D5.1090704@gmail.com><ca471dc2050514080260948597@mail.gmail.com>
	<4286855C.6020103@hathawaymix.org>
Message-ID: <009001c55943$33468e30$9a7cfea9@maxine>

Shane Hathaway wrote:
> Guido van Rossum wrote:
> >
> > Consider an application where you have to acquire *two* locks regularly:
> >
> You really have to write it like this:

Shane, you've already solved this one more elegantly:

def lockBoth():
    return combining(lock1.locking(), lock2.locking())

using the combining function you wrote earlier, which I assume will
make it into the library.

- Anders


From p.f.moore at gmail.com  Sun May 15 15:25:00 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Sun, 15 May 2005 14:25:00 +0100
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4286521E.5010703@ocf.berkeley.edu>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286521E.5010703@ocf.berkeley.edu>
Message-ID: <79990c6b05051506251712ad05@mail.gmail.com>

On 5/14/05, Brett C. <bac at ocf.berkeley.edu> wrote:
> Nick's was obviously directly against looping, but, with no offense to Nick,
> how many other people were against it looping?  It never felt like it was a
> screaming mass with pitchforks but more of a "I don't love it, but I can deal"
> crowd.

Agreed. That's certainly how I felt originally.

There were a *lot* of nice features with PEP 340. The initial
discussion had a lot of people enthusiastic about all the neat things
they could do with it. That's disappeared now, in a long series of
attempts to "fix" the looping issue. No-one is looking at PEP 343, or
Nick's PEP 3XX, and saying "hey, that's neat - I can do XXX with
that!". This makes me feel that we've thrown out the baby with the
bathwater. (Yes, I know PEP 342 is integral to many of the neat
features, but I get the impression that PEP 342 is being lost - later
iterations of the other two PEPs are going out of their way to avoid
assuming PEP 342 is implemented...)

Looping is definitely a wart. Looping may even be a real problem in
some cases. There may be cases where an explicit try...finally remains
better, simply to avoid an unwanted looping behaviour.

But I'll live with that to get back the enthusiasm for a new feature
that started all of this. Much better than the current "yes, I guess
that's good enough" tone to the discussion.

Paul.

PS Guido - next time you get a neat idea like PEP 340, just code it
and check it in. Then we can just badger you to fix the code, rather
than using up all your time on discussion before there's an
implementation :-)

From steven.bethard at gmail.com  Sun May 15 16:15:27 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Sun, 15 May 2005 08:15:27 -0600
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4286F8CB.4090001@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
Message-ID: <d11dcfba050515071553dca64c@mail.gmail.com>

On 5/15/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

From there I see the semantics:

VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
exc = (None, None, None)
try:
    try:
        BLOCK1
    except:
        exc = sys.exc_info()
finally:
    stmt_exit(*exc)

Don't you want a "raise" after the "exc = sys.exc_info()"?

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From tjreedy at udel.edu  Sun May 15 17:43:43 2005
From: tjreedy at udel.edu (Terry Reedy)
Date: Sun, 15 May 2005 11:43:43 -0400
Subject: [Python-Dev] Loading compiled modules under MSYS/MingGW?
References: <20050515195624.m1bkd9x3lwc84gkc@www.eyemagnet.net>
Message-ID: <d67qmu$fej$1@sea.gmane.org>


"Steve Castellotti" <sc at eyemagnet.net> wrote in message 
news:20050515195624.m1bkd9x3lwc84gkc at www.eyemagnet.net...
>   Simple question. I'm working on getting Python [2.3] support enabled
> for the Gimp under Win32.
...
>    Am I missing something obvious? Is this a question better suited to
> MinGW/MSYS mailing lists, or perhaps the Gimp list?

Since this is a question about using past Python [2.3], while this is a 
list for developing future Pythons [2.5+], the Python part of your question 
would better be directed to comp.lang.python or the corresponding mailing 
list.  Whether the other lists would be any better, I have no opinion.

Terry J. Reedy




From jcarlson at uci.edu  Sun May 15 17:52:47 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Sun, 15 May 2005 08:52:47 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <42870B86.9070605@hathawaymix.org>
References: <4286F8CB.4090001@gmail.com> <42870B86.9070605@hathawaymix.org>
Message-ID: <20050515083625.7626.JCARLSON@uci.edu>


Shane Hathaway <shane at hathawaymix.org> wrote:
> 
> Nick Coghlan wrote:
> > That would be good, in my opinion. I updated PEP 3XX to use this idea:
> > http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
> > 
> > With that update (to version 1.6), PEP 3XX is basically PEP 343, but injecting 
> > exceptions that occur into the template generator's internal frame instead of 
> > invoking next().
> > 
> > The rest of the PEP is then about dealing with the implications of allowing 
> > yield inside try/finally statements.
> 
> You might add to the PEP the following example, which could really
> improve the process of building GUIs in Python:
> 
>     class MyFrame(Frame):
>         def __init__(self):
>             with Panel():
>                 with VerticalBoxSizer():
>                     self.text = TextEntry()
>                     self.ok = Button('Ok')
> 
> Indentation improves the readability of code that creates a hierarchy.

I've generally been fairly ambivalent about the entire PEP 310/340/343
issue.  While resource allocation and release is quite useful, not a
whole heck of a lot of my code has to deal with it.

But after seeing this sketch of an example for its use in GUI
construction; I must have it!

+1 for:
 - 'do' keyword (though 'try' being reused scratches the 'no new keyword'
itch for me, and is explicit about "I won't loop")
 - PEP 343
 - try/finally + @do_template with generators


 - Josiah


From steven.bethard at gmail.com  Sun May 15 19:27:24 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Sun, 15 May 2005 11:27:24 -0600
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <79990c6b05051506251712ad05@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286521E.5010703@ocf.berkeley.edu>
	<79990c6b05051506251712ad05@mail.gmail.com>
Message-ID: <d11dcfba050515102736cdd7da@mail.gmail.com>

On 5/15/05, Paul Moore <p.f.moore at gmail.com> wrote:
> There were a *lot* of nice features with PEP 340. The initial
> discussion had a lot of people enthusiastic about all the neat things
> they could do with it. That's disappeared now, in a long series of
> attempts to "fix" the looping issue.

Having done the python-dev summary on this topic, I think the initial
"enthusiasm" you were seeing included a lot of "what if we did it this
way?" or "what if we extended this further in another way?" kind of
stuff.  When PEP 340 finally came out (around the end of the month),
the more extreme ideas were discarded.  So in some sense, PEP 340 was
the reason for the lack of "enthusiasm"; with the semantics laid out,
people were forced to deal with a specific implementation instead of a
variety of wild suggestions.

> No-one is looking at PEP 343, or
> Nick's PEP 3XX, and saying "hey, that's neat - I can do XXX with
> that!". This makes me feel that we've thrown out the baby with the
> bathwater.

I'd be surprised if you can find many examples that PEP 340 can do
that PEP 3XX can't.  The only real looping example we had was
auto_retry, and there's a reasonably simple solution to that in PEP
3XX.  You're not going to see anyone saying "hey that's neat - I can
do XXX with that!" because PEP 3XX doesn't add anything.  But for 95%
of the cases, it doesn't take anything away either.

> (Yes, I know PEP 342 is integral to many of the neat
> features, but I get the impression that PEP 342 is being lost - later
> iterations of the other two PEPs are going out of their way to avoid
> assuming PEP 342 is implemented...)

Not very far out of their way.  I split off PEP 342 from the original
PEP 340, and it was ridiculously easy for two reasons:
 * the concepts are very orthogonal; the only thing really used from
it in any of PEP 340 was the "yield" means "yield None" thing
 * there weren't *any* examples of using the "continue EXPR" syntax. 
PEP 342 is still lacking in this spot

If you want to get people enthused about PEP 342 again (which is the
right way to make sure it gets accepted), what would really help is a
bunch of good examples of how it could be used.

> Looping is definitely a wart. Looping may even be a real problem in
> some cases. There may be cases where an explicit try...finally remains
> better, simply to avoid an unwanted looping behaviour.
> 
> But I'll live with that to get back the enthusiasm for a new feature
> that started all of this. Much better than the current "yes, I guess
> that's good enough" tone to the discussion.

I'm convinced that such a tone is inevitable after 30 days and over
700 messages on *any* topic. ;-)

Ok, back to summarizing this fortnight's 380+ PEP 340 messages. ;-)

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From bac at OCF.Berkeley.EDU  Sun May 15 22:51:13 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Sun, 15 May 2005 13:51:13 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <79990c6b05051506251712ad05@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	
	<d64imp$fai$1@sea.gmane.org>	
	<ca471dc205051410435473d2b2@mail.gmail.com>	
	<4286521E.5010703@ocf.berkeley.edu>
	<79990c6b05051506251712ad05@mail.gmail.com>
Message-ID: <4287B641.40002@ocf.berkeley.edu>

Paul Moore wrote:
> On 5/14/05, Brett C. <bac at ocf.berkeley.edu> wrote:
> 
>>Nick's was obviously directly against looping, but, with no offense to Nick,
>>how many other people were against it looping?  It never felt like it was a
>>screaming mass with pitchforks but more of a "I don't love it, but I can deal"
>>crowd.
> 
> 
> Agreed. That's certainly how I felt originally.
> 

Oh good.  So I am not nuts.  =)

> There were a *lot* of nice features with PEP 340. The initial
> discussion had a lot of people enthusiastic about all the neat things
> they could do with it. That's disappeared now, in a long series of
> attempts to "fix" the looping issue. No-one is looking at PEP 343, or
> Nick's PEP 3XX, and saying "hey, that's neat - I can do XXX with
> that!". This makes me feel that we've thrown out the baby with the
> bathwater. (Yes, I know PEP 342 is integral to many of the neat
> features, but I get the impression that PEP 342 is being lost - later
> iterations of the other two PEPs are going out of their way to avoid
> assuming PEP 342 is implemented...)
> 

My feelings exactly.  I was really happy and excited when it seemed like
everyone really liked PEP 340 sans a few disagreements on looping and other
things.  Having a huge chunk of people get excited and liking a proposal was a
nice contrast to the whole decorator debate.

> Looping is definitely a wart. Looping may even be a real problem in
> some cases. There may be cases where an explicit try...finally remains
> better, simply to avoid an unwanted looping behaviour.
> 

Which I think is actually fine if they do just use a try/finally if it fits the
situation better.

> But I'll live with that to get back the enthusiasm for a new feature
> that started all of this. Much better than the current "yes, I guess
> that's good enough" tone to the discussion.
> 

Ditto.

-Brett

From ncoghlan at gmail.com  Mon May 16 00:03:02 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 16 May 2005 08:03:02 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <d11dcfba050515071553dca64c@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>	<4286F8CB.4090001@gmail.com>
	<d11dcfba050515071553dca64c@mail.gmail.com>
Message-ID: <4287C716.3060302@gmail.com>

Steven Bethard wrote:
> On 5/15/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> 
>>http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
> 
> 
>>From there I see the semantics:
> 
> VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
> exc = (None, None, None)
> try:
>     try:
>         BLOCK1
>     except:
>         exc = sys.exc_info()
> finally:
>     stmt_exit(*exc)
> 
> Don't you want a "raise" after the "exc = sys.exc_info()"?

Oops. . . yeah, that must have gotten lost somewhere along the way.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Mon May 16 00:18:57 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 16 May 2005 08:18:57 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <79990c6b05051506251712ad05@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>	<4286521E.5010703@ocf.berkeley.edu>
	<79990c6b05051506251712ad05@mail.gmail.com>
Message-ID: <4287CAD1.6090404@gmail.com>

Paul Moore wrote:
> Looping is definitely a wart. Looping may even be a real problem in
> some cases. There may be cases where an explicit try...finally remains
> better, simply to avoid an unwanted looping behaviour.

I agree PEP 343 throws away too much that was good about PEP 340 - that's why 
I'm still updating PEP 3XX as the discussion continues.

But is there anything PEP 340 does that PEP 3XX doesn't, other than letting you 
suppress exceptions?

The only example the latest version of PEP 3XX drops since the original PEP 340 
is auto_retry - and that suffers from the hidden control flow problem, so I 
doubt Guido would permit it, even *if* the new statement was once again a loop. 
And if the control flow in response to an exception can't be affected, it 
becomes even *harder* to explain how the new statement differs from a standard 
for loop.

 > But I'll live with that to get back the enthusiasm for a new feature
 > that started all of this. Much better than the current "yes, I guess
 > that's good enough" tone to the discussion.

I think the current tone is more due to the focus on addressing problems with 
all of the suggestions - that's always going to dampen enthusiasm.

Every PEP 340 use case that doesn't involve suppressing exceptions (i.e. all of 
them except auto_retry) can be written under the current PEP 3XX using 
essentially *identical* generator code (the only difference is the 
statement_template decorator at the top of the generator definition)

Pros of PEP 3XX 1.6 vs PEP 340:
  - control flow is easier to understand
  - can use inside loops without affecting break/continue
  - easy to write enter/exit methods directly on classes
  - template generators can be reused safely
  - iterator generator resource management is dealt with

Cons of PEP 3XX 1.6 vs PEP 340:
  - no suppression of exceptions (see first pro, though)

Of course, I may be biased :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From python-dev at zesty.ca  Mon May 16 00:54:44 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Sun, 15 May 2005 17:54:44 -0500 (CDT)
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <42870B86.9070605@hathawaymix.org>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com> <42870B86.9070605@hathawaymix.org>
Message-ID: <Pine.LNX.4.58.0505151748430.14555@server1.LFW.org>

On Sun, 15 May 2005, Shane Hathaway wrote:
> You might add to the PEP the following example, which could really
> improve the process of building GUIs in Python:
>
>     class MyFrame(Frame):
>         def __init__(self):
>             with Panel():
>                 with VerticalBoxSizer():
>                     self.text = TextEntry()
>                     self.ok = Button('Ok')

I don't understand how this would be implemented.  Would a widget
function like 'TextEntry' set the parent of the widget according to
some global 'parent' variable?  If so, how would 'Panel' know that
its parent is supposed to be the 'MyFrame' object?


-- ?!ng

From tdelaney at avaya.com  Mon May 16 01:32:01 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Mon, 16 May 2005 09:32:01 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721282@au3010avexu1.global.avaya.com>

Shane Hathaway wrote:

> PEP 340 is very nice, but it became less appealing to me when I saw
> what  it would do to "break" and "continue" statements.

Absolutely. I really liked PEP 340, but two things stood out to me as
being flawed:

1. Looping - it took a while for me to realise this was bugging me - the
unintuitive behaviour of break and continue clarified it for me.

2. Not re-raising exceptions automatically. I actually proposed at one
point that any exception should be re-raised at the end of the iterator
finalisation unless another exception was raised (not StopIteration,
etc). I still continue to support this. It also deals with the
control-flow issue (being hidden).

PEP 3XX has some other nice things. I have two opposing views on
for-loops supporting finalisation of iterators though:

1. for-loops should support finalisation. However, in that case every
iterator should be finalisable by default, and you should have to go out
of your way to prevent it (or prevent forced exhaustion of the
iterator). I think there's zero chance of this proposal being accepted
;)

2. for-loops should not support finalisation at all, and if you want
finalisation semantics you need to enclose the for-loop in a do/with
statement.

I think the most important thing is that the semantics must be
absolutely clear to someone looking at the code for the first time,
without knowing the particulars of the statement being used. To me that
suggests that the following 3 factors are the most important:

1. Non-looping semantics. Looping semantics require me to think more
about the actual behaviour in the presence of break/continue.

2. An exception raised in the body of the statement must be propagated
outside of the statement. I'm willing to accept another exception being
raised in its place, but in that case I think it would be a good idea to
chain the exceptions in some way. In any case, if the body of the
statement terminates abnormally, it should not be possible for the
statement to change that abnormal exit.

3. There should be a single statement (other than try...finally) that
has finalisation semantics i.e. for-loop doesn't.

Tim Delaney

From jcarlson at uci.edu  Mon May 16 02:05:11 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Sun, 15 May 2005 17:05:11 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <Pine.LNX.4.58.0505151748430.14555@server1.LFW.org>
References: <42870B86.9070605@hathawaymix.org>
	<Pine.LNX.4.58.0505151748430.14555@server1.LFW.org>
Message-ID: <20050515164940.7629.JCARLSON@uci.edu>


Ka-Ping Yee <python-dev at zesty.ca> wrote:
> 
> On Sun, 15 May 2005, Shane Hathaway wrote:
> > You might add to the PEP the following example, which could really
> > improve the process of building GUIs in Python:
> >
> >     class MyFrame(Frame):
> >         def __init__(self):
> >             with Panel():
> >                 with VerticalBoxSizer():
> >                     self.text = TextEntry()
> >                     self.ok = Button('Ok')
> 
> I don't understand how this would be implemented.  Would a widget
> function like 'TextEntry' set the parent of the widget according to
> some global 'parent' variable?  If so, how would 'Panel' know that
> its parent is supposed to be the 'MyFrame' object?

It would actually take a bit more to make this work properly.

If those objects were aware of the resource allocation mechanism, they
could add and remove themselves from a context stack as necessary.  In
the case of things like VerticalBoxSizer, save the current self
dictionary on entrance, then check for changes on exit, performing an
Add with all the new objects. Or, so that it doesn't change the way
wxPython works with other versions of Python, everything could be
wrapped, perhaps using something like...

class MyFrame(Frame):
    def __init__(self):
        with new_context(self):
            with parented(Panel, self) as panel:
                with unparented(VerticalBoxSizer, 'Add') as panel.sizer:
                    self.text = TextEntry(panel)
                    self.ok = Button(panel, 'Ok')

There would be a little more work involved to generate a reasonable API
for this, but it is all possible.

 - Josiah


From gvanrossum at gmail.com  Mon May 16 02:50:55 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Sun, 15 May 2005 17:50:55 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <d11dcfba050515071553dca64c@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
	<d11dcfba050515071553dca64c@mail.gmail.com>
Message-ID: <ca471dc205051517501b99e8b8@mail.gmail.com>

[Nick Coghlan]
> > http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

[Steven Bethard]
> there I see the semantics:
> 
> VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause
> exc = (None, None, None)
> try:
>     try:
>         BLOCK1
>     except:
>         exc = sys.exc_info()
> finally:
>     stmt_exit(*exc)
> 
> Don't you want a "raise" after the "exc = sys.exc_info()"?

I have the same question for Nick. Interestingly, assuming Nick meant
that "raise" to be there, PEP 3XX and PEP 343 now have the same
translation. In rev 1.10 I moved the __enter__ call out of the
try-block again. Having it inside was insane: when __enter__ fails, it
should do its own cleanup rather than expecting __exit__ to clean up
after a partial __enter__.
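
With the missing "raise" restored, the expansion the two PEPs would
now share might be sketched like this (run_with, manager and block are
illustrative stand-ins for what the compiler would generate inline):

```python
import sys

def run_with(manager, block):
    # __enter__ is outside the try-block: if it fails, it must do its
    # own cleanup rather than rely on __exit__.
    var = manager.__enter__()
    exc = (None, None, None)
    try:
        try:
            block(var)
        except:
            exc = sys.exc_info()
            raise              # the re-raise under discussion
    finally:
        manager.__exit__(*exc)
```

With the "raise" present, the statement behaves like a finally-clause:
the block's exception always propagates, and __exit__ merely observes it.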

But some of the claims from PEP 3XX seem to be incorrect now: Nick
claims that a with-statement can abstract an except clause, but that's
not the case; an except clause causes the control flow to go forward
(continue after the whole try statement) but the with-statement (with
the "raise" added) always acts like a finally-clause, which implicitly
re-raises the exception. So, in particular, in this example:

with EXPR1:
    1/0
print "Boo"

the print statement is unreachable and there's nothing clever you can
put in an __exit__ method to make it reachable. Just like in this
case:

try:
    1/0
finally:
    BLOCK1
print "Boo"

there's nothing that BLOCK1 can do to cause the print statement to be reached.

This claim in PEP 3XX may be a remnant from a previous version; or it
may be that Nick misunderstands how 'finally' works.

Anyway, I think we may be really close at this point, if we can agree
on an API for passing exceptions into generators and finalizing them,
so that the generator can be written using a try/finally around the
yield statement.

Of course, it's also possible that Nick did *not* mean for the missing
"raise" to be there. But in that case other claims from his PEP become
false, so I'm assuming with Steven here. Nick?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Mon May 16 03:21:14 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Sun, 15 May 2005 21:21:14 -0400
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4287CAD1.6090404@gmail.com>
References: <79990c6b05051506251712ad05@mail.gmail.com>
	<ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286521E.5010703@ocf.berkeley.edu>
	<79990c6b05051506251712ad05@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050515211811.01d52038@mail.telecommunity.com>

At 08:18 AM 5/16/2005 +1000, Nick Coghlan wrote:
>Paul Moore wrote:
> > Looping is definitely a wart. Looping may even be a real problem in
> > some cases. There may be cases where an explicit try...finally remains
> > better, simply to avoid an unwanted looping behaviour.
>
>I agree PEP 343 throws away too much that was good about PEP 340 - that's why
>I'm still updating PEP 3XX as the discussion continues.

Could you please stop calling it PEP 3XX and go ahead and submit it as a 
real PEP?  Either that, or post its URL *every* time you mention it, 
because at this point I don't know where to go to read it, and the same 
applies for each new person to enter the discussion.  Thanks.


From shane at hathawaymix.org  Mon May 16 04:11:02 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Sun, 15 May 2005 20:11:02 -0600
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <Pine.LNX.4.58.0505151748430.14555@server1.LFW.org>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com> <42870B86.9070605@hathawaymix.org>
	<Pine.LNX.4.58.0505151748430.14555@server1.LFW.org>
Message-ID: <42880136.6020403@hathawaymix.org>

Ka-Ping Yee wrote:
> On Sun, 15 May 2005, Shane Hathaway wrote:
> 
>>You might add to the PEP the following example, which could really
>>improve the process of building GUIs in Python:
>>
>>    class MyFrame(Frame):
>>        def __init__(self):
>>            with Panel():
>>                with VerticalBoxSizer():
>>                    self.text = TextEntry()
>>                    self.ok = Button('Ok')
> 
> 
> I don't understand how this would be implemented.  Would a widget
> function like 'TextEntry' set the parent of the widget according to
> some global 'parent' variable?  If so, how would 'Panel' know that
> its parent is supposed to be the 'MyFrame' object?

Try this version, which I sent to Nick earlier:

    class MyFrame(Frame):
        def __init__(self):
            with Panel(self):
                with VerticalBoxSizer(self):
                    self.text = TextEntry(self)
                    self.ok = Button(self, 'Ok')

The 'self' parameter tells the component to add itself to the current
parent inside the Frame.  The current parent is a temporary variable set
by 'with' statements.  Outside any 'with' statement, the current parent
is the frame.

There is only a little magic.  Maybe someone can find an even less
magical pattern, but it's a lot easier to read and write than the status
quo:

    class MyFrame(Frame):
        def __init__(self):
            p = Panel()
            self.add(p)
            sizer = VerticalBoxSizer(p)
            p.add(sizer)
            self.text = TextEntry()
            sizer.add(self.text)
            self.ok = Button('Ok')
            sizer.add(self.ok)
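
One way the "current parent" bookkeeping might be implemented (this is
a sketch, not Shane's actual code: the frame keeps a stack of parents,
components append themselves to the top of it, and the with-statement
pushes and pops; all the widget classes are stand-ins for a real toolkit):

```python
class Component:
    def __init__(self, frame=None):
        self.children = []
        self._frame = frame if frame is not None else self
        if frame is not None:
            # add self to whatever is currently on top of the stack
            frame._parent_stack[-1].children.append(self)

    def __enter__(self):
        self._frame._parent_stack.append(self)
        return self

    def __exit__(self, *exc_info):
        self._frame._parent_stack.pop()
        return False

class Frame(Component):
    def __init__(self):
        self._parent_stack = [self]   # outside any 'with', parent is the frame
        Component.__init__(self)

class Panel(Component): pass
class VerticalBoxSizer(Component): pass
class TextEntry(Component): pass

class Button(Component):
    def __init__(self, frame, label):
        Component.__init__(self, frame)
        self.label = label

class MyFrame(Frame):
    def __init__(self):
        Frame.__init__(self)
        with Panel(self):
            with VerticalBoxSizer(self):
                self.text = TextEntry(self)
                self.ok = Button(self, 'Ok')
```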

Shane

From rrr at ronadam.com  Mon May 16 05:21:12 2005
From: rrr at ronadam.com (Ron Adam)
Date: Sun, 15 May 2005 23:21:12 -0400
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4286F8CB.4090001@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
Message-ID: <428811A8.9000706@ronadam.com>


It's been interesting watching all the loops this discussion has gone 
through.  I'm not sure the following is compatible with the current 
proposals, but maybe it will spur some ideas or help rule out something.

There have been several examples of problems with opening several 
resources inside an enter method and how to resolve them in the case of 
an incomplete entry.

So it seems to me that each resource needs to be handled separately, but 
that doesn't mean you can't use them as a group.

So I was wondering if something like the following is feasible?


# open-close pairing
def opening(filename, mode):
     def openfile():
         f = open(filename, mode)
         try:
             yield f
         finally:
             f.close()
     return openfile

with opening(file1,m),opening(file2,m),opening(file3,m) as f1,f2,f3:
     # do stuff with files


The 'with' (or whatever) statement would need a little more under the 
hood, but it might simplify handling multiple resources.

This also reduces nesting in cases such as locking and opening. Both 
must succeed before the block executes. And if something goes wrong, the 
"with" statement knows and can handle each resource.  The point is, each 
resource needs to be a whole unit; opening multiple files in one 
resource handler is probably not a good idea anyway.

Regards,
_Ron Adam


From adv at langdale.com.au  Mon May 16 05:52:59 2005
From: adv at langdale.com.au (Arnold deVos)
Date: Mon, 16 May 2005 13:52:59 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051517501b99e8b8@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>	<4286F8CB.4090001@gmail.com>	<d11dcfba050515071553dca64c@mail.gmail.com>
	<ca471dc205051517501b99e8b8@mail.gmail.com>
Message-ID: <d695e7$klf$1@sea.gmane.org>

Guido van Rossum wrote:
> [...] But some of the claims from PEP 3XX seem to be incorrect now: Nick
> claims that a with-statement can abstract an except clause, but that's
> not the case; [...]

Sorry for being a lurker, but can I try and expand this point.

The options:

- If we don't allow the except clause in the generator, the exception 
can't be examined there.

- If we do allow the except clause we must (IMO) also allow the 
generator to suppress the exception. It would be surprising behaviour if 
a caught exception were re-raised without an explicit raise statement.

An argument:

Despite the control-flow-macros-are-harmful discussion, I see nothing 
wrong with a block controller swallowing its block's exceptions because:

- In most proposals it can raise its own exception in place of the 
block's exception anyway.

- In the following example there is nothing surprising if controller() 
swallows block()'s exception:

def block():
	# do stuff
	raise E
controller(block)

Perhaps we don't want the block controller statement to have as much 
power over its block as controller() has over block() above. But 
handling an exception is not so radical is it?

- Arnold.


From greg.ewing at canterbury.ac.nz  Mon May 16 06:51:49 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 16 May 2005 16:51:49 +1200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
References: <5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
Message-ID: <428826E5.8060102@canterbury.ac.nz>

Phillip J. Eby wrote:
> This makes it seem awkward for e.g. "do self.__lock", which doesn't 
> make any sense.  But the extra call needed to make it "do 
> locking(self.__lock)" seems sort of gratuitous.

How about

   do holding(self.__lock):
     ...

> It makes me wonder if "with" or "using" or some similar word that works 
> better with nouns might be more appropriate ...   For
> example, a Decimal Context object might implement __enter__ by setting 
> itself as the thread-local context, and __exit__ by restoring the previous 
> context.    "do aDecimalContext" doesn't make much sense, but "with 
> aDecimalContext" or "using aDecimalContext" reads quite nicely.

It doesn't work so well when you don't already have an
object with one obvious interpretation of what you want
to do 'with' it, e.g. you have a pathname and you want
to open a file. I've already argued against giving file
objects __enter__ and __exit__ methods. And I'm -42 on
giving them to strings. :-)
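
The 'holding' helper suggested above might look like this (the name is
from the message; the body is an assumption, written as the kind of
generator the PEPs envisage -- today's contextlib.contextmanager plays
the role the PEP machinery would):

```python
from contextlib import contextmanager

@contextmanager
def holding(lock):
    lock.acquire()
    try:
        yield lock      # the block runs here, with the lock held
    finally:
        lock.release()  # released even if the block raises
```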

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Mon May 16 06:57:19 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 16 May 2005 16:57:19 +1200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc2050513205855dcba6e@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
	<ca471dc2050513205855dcba6e@mail.gmail.com>
Message-ID: <4288282F.1050104@canterbury.ac.nz>

Guido van Rossum wrote:

>>Also, one question: will the "do protocol" be added to built-in "resource"
>>types?  That is, locks, files, sockets, and so on?
> 
> One person proposed that and it was shot down by Greg Ewing. I think
> it's better to require a separate wrapper.

It depends on whether the resource is "reusable". It
would be okay for locks since you can lock and unlock
the same lock as many times as you want, but files
and sockets can only be used once, so there has to
be something else around them.

Although if we use 'do', we might want to use wrappers
anyway for readability, even if they're not semantically
necessary.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From rrr at ronadam.com  Mon May 16 07:04:15 2005
From: rrr at ronadam.com (Ron Adam)
Date: Mon, 16 May 2005 01:04:15 -0400
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <428811A8.9000706@ronadam.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>	<4286F8CB.4090001@gmail.com>
	<428811A8.9000706@ronadam.com>
Message-ID: <428829CF.5010707@ronadam.com>


An additional comment (or two) on my previous message before I go back to 
lurk mode.

If the recommended use of each resource template is kept to a single 
resource, then each enter and exit can be considered a whole block of 
code that will either pass or fail. You can then simplify the previous 
template to just:


# open-close pairing
def opening(filename, mode):
     def openfile():
         f = open(filename, mode)
         yield f
         f.close()
     return openfile

with opening(file1,m),opening(file2,m),opening(file3,m) as f1,f2,f3:
     # do stuff with files


The with statement will need to catch any opening errors at the start, in 
case one of the resources fails to open: close any resources already 
opened, then re-raise the first exception.

The with statement will also need to catch any uncaught exceptions in 
the block, close any opened resources, then re-raise the exception.

And again when closing, it will need to catch any exceptions that occur 
until it has tried to close all open resources, then re-raise the first 
exception.

Although it's possible for more than one exception to occur, it should 
always re-raise the first one, as any secondary exceptions may just be 
side effects of the first.
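
That cleanup order might be sketched as follows (open_all is an
illustrative helper name, not part of any proposal: open each resource
in turn; on failure, close whatever was already opened, ignoring
secondary errors, and re-raise the first exception):

```python
def open_all(*factories):
    opened = []
    try:
        for factory in factories:
            opened.append(factory())
    except:
        for resource in reversed(opened):
            try:
                resource.close()
            except Exception:
                pass          # secondary errors are likely side effects
        raise                 # re-raise the first exception
    return opened
```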

The programmer has the option to surround the 'with' block with a 
try/except if he wants to catch any exceptions raised.

He should also be able to put try excepts before the yield, and after 
the yield, or in the block. (But not surrounding the yield, I think.)

Of course he may cause himself more problems than not, but that should 
be his choice, and maybe he thought of some use.

This might be an acceptable use case of try-except in the enter section.

def openfile(firstfile, altfile, mode):
     try:
         f = open(firstfile, mode)
     except IOError:
         f = open(altfile, mode)
     yield f
     f.close()


This is still a single open resource (if it succeeds), so it should work OK.

Alternate closing could be possible, maybe retrying the close several 
times before raising it and letting the 'with' handle it.

Cheers,
_Ron




From greg.ewing at canterbury.ac.nz  Mon May 16 07:11:29 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 16 May 2005 17:11:29 +1200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <d64ous$v7a$1@sea.gmane.org>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org> <4285D29F.7080100@gmail.com>
	<d64ous$v7a$1@sea.gmane.org>
Message-ID: <42882B81.7030304@canterbury.ac.nz>

Fredrik Lundh wrote:

>     try with opening(file) as f:
>         body
>     except IOError:
>         deal with the error (you have to do this anyway)

You don't usually want to do it right *there*, though.
More likely you'll have something further up that deals
with a variety of possible errors.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From python-dev at zesty.ca  Mon May 16 07:30:58 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Mon, 16 May 2005 00:30:58 -0500 (CDT)
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051517501b99e8b8@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
	<d11dcfba050515071553dca64c@mail.gmail.com>
	<ca471dc205051517501b99e8b8@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505160027310.14555@server1.LFW.org>

On Sun, 15 May 2005, Guido van Rossum wrote:
> In rev 1.10 I moved the __enter__ call out of the
> try-block again. Having it inside was insane: when __enter__ fails, it
> should do its own cleanup rather than expecting __exit__ to clean up
> after a partial __enter__.

No, it wasn't insane.  You had a good reason for putting it there.

The question is what style of implementation you want to encourage.

If you put __enter__ inside, then you encourage idempotent __exit__,
which makes resource objects easier to reuse.

If you put __enter__ outside, that allows the trivial case to be
written a little more simply, but also makes it hard to reuse.
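
The "idempotent __exit__" style might be sketched like this ('Opening'
and its attributes are illustrative): __exit__ only releases what
__enter__ actually acquired, so calling it after a failed, or already
finished, __enter__ is harmless.

```python
class Opening:
    def __init__(self, path, mode='r'):
        self.path, self.mode = path, mode
        self.file = None

    def __enter__(self):
        self.file = open(self.path, self.mode)
        return self.file

    def __exit__(self, *exc_info):
        if self.file is not None:   # no-op if nothing was acquired
            self.file.close()
            self.file = None
        return False
```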


-- ?!ng

From steven.bethard at gmail.com  Mon May 16 07:53:52 2005
From: steven.bethard at gmail.com (Steven Bethard)
Date: Sun, 15 May 2005 23:53:52 -0600
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4286F8CB.4090001@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
Message-ID: <d11dcfba050515225358feaaa9@mail.gmail.com>

On 5/15/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

In reading over PEP 3XX again, it struck me that I'd been having a
really hard time grasping exactly when I needed to use the
"needs_finish" decorator.  Am I right in saying that I should use the
"needs_finish" decorator every time I have a "yield" inside a
with-statement or a try/finally?  Are there other situations where I
might need the "needs_finish" decorator?

If it's true that I need the "needs_finish" decorator every time I
have a "yield" inside a with-statement or a try/finally, I'd be
inclined to do this automatically.  That is, since a yield inside a
with-statement or try/finally can be determined lexically (heck, we do
it now to disallow it), generators that have such code should be
automatically wrapped with the "needs_finish" decorator, i.e. they
should automatically acquire a __finish__ method.

If I've misunderstood, and there are other situations when
"needs_finish" is required, it'd be nice to see some more examples.

STeVe
-- 
You can wordify anything if you just verb it.
        --- Bucky Katt, Get Fuzzy

From tdelaney at avaya.com  Mon May 16 07:59:41 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Mon, 16 May 2005 15:59:41 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721283@au3010avexu1.global.avaya.com>

Steven Bethard wrote:

> If I've misunderstood, and there are other situations when
> "needs_finish" is required, it'd be nice to see some more examples.

The other cases are where you want to do something in response to an
exception, but not otherwise::

    def gen():
        try:
            yield
        except:
            print 'Got exception:', sys.exc_info()
            raise

Personally, I think they're rare enough that you could use a decorator
in those cases, but still have::

    def gen():
        try:
            yield
        finally:
            pass

automatically make the generator conform to the do/with protocol.

Tim Delaney

From greg.ewing at canterbury.ac.nz  Mon May 16 08:07:53 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 16 May 2005 18:07:53 +1200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc2050514081229edefd4@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org> <4285D29F.7080100@gmail.com>
	<d64ous$v7a$1@sea.gmane.org>
	<79990c6b0505140608277706a7@mail.gmail.com>
	<42860935.4080607@gmail.com>
	<ca471dc2050514081229edefd4@mail.gmail.com>
Message-ID: <428838B9.9010500@canterbury.ac.nz>

Guido van Rossum wrote:

> But then the reason for separating VAR from EXPR becomes unclear.
> Several people have mentioned that they thought this was "a good idea
> on its own", but without giving additional use cases. Without the
> ability to write the acquire/release template as a generator, the big
> question is, "why not just PEP 310" ?

Here's another use case:

In PyGUI, in order to abstract the various ways that
different platforms deal with drawing contexts, my
widgets currently have a method

   widget.with_canvas(func)

where you define func as

   def func(canvas):
     # do drawing operations on the canvas

The canvas is a separate object from the widget so
that it's harder to make the mistake of trying to
draw to the widget outside of the appropriate context.

The with-statement form of this would be

   with widget.canvas() as c:
     # do drawing operations on c

Keeping the VAR and EXPR separate in this case better
reflects the semantics of the original with_canvas()
function. The canvas is strictly a local object
produced as a result of executing the __enter__
method, which helps ensure that the correct protocol
is followed -- if you haven't called __enter__, you
don't have a canvas, so you can't do any drawing.

On the other hand, this leads to some awkwardness
in the naming conventions. The canvas() method,
despite its name, doesn't actually return a canvas,
but another object which, when used in the right
way, produces a canvas.

In general, the names of methods for use in a
with-statement will need to be named according
to the object which is bound to the VAR, rather
than what they actually return.

I'm not sure whether this is a problem or not.
There's already a similar situation with generators,
which are more usefully named according to what
they yield, rather than what they return when
considered as a function. I just feel a bit
uncomfortable giving my widgets a function called
canvas() that doesn't return a canvas.

The alternative is that the canvas() method *does*
return a canvas, with __enter__ and __exit__
methods, and the rule that you have to use the
appropriate protocol before you can use it. This
would avoid most of the aforementioned problems.
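
That second option might be sketched as follows (Widget and Canvas are
stand-ins for the PyGUI classes, and the drawing operation is invented
for illustration): the canvas itself carries __enter__/__exit__, and
drawing outside the protocol is an error.

```python
class Canvas:
    def __init__(self):
        self._active = False
        self.ops = []

    def __enter__(self):
        self._active = True        # platform-specific setup goes here
        return self

    def __exit__(self, *exc_info):
        self._active = False       # platform-specific teardown
        return False

    def line(self, start, end):
        if not self._active:
            raise RuntimeError("drawing outside a with-statement")
        self.ops.append(('line', start, end))

class Widget:
    def canvas(self):
        return Canvas()            # the method really returns a canvas
```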

So I think I may have just talked myself out of
what I was originally intending to argue!

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Mon May 16 08:11:50 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 16 May 2005 18:11:50 +1200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4285C6E0.90307@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<4285C6E0.90307@gmail.com>
Message-ID: <428839A6.5080409@canterbury.ac.nz>

Nick Coghlan wrote:

> The naming convention for 'do' is shown in the current PEP 343. The issue I've 
> noticed with it is that *functions* read well, but methods don't because things 
> get out of sequence. That is, "do locking(the_lock)" reads well, but "do 
> the_lock.locking()" does not.
> 
> Whereas, using 'with', it can be written either way, and still read reasonably 
> well ("with locked(the_lock)", "with the_lock.locked()").
> 
> The 'with' keyword also reads better if objects natively support use in 'with' 
> blocks ("with the_lock", "with the_file").
> 
> Guido's concern regarding file objects being reused inappropriately can be dealt 
> with in the file __enter__ method:
> 
>    def __enter__(self):
>        if self.closed:
>            raise RuntimeError, "Cannot reopen closed file handle"
> 
> For files, it may then become the common practice to keep pathnames around, 
> rather than open file handles. When you actually needed access to the file, the 
> existing "open" builtin would suffice:
> 
>    with open(filename, "rb") as f:
>        for line in f:
>            print line

I think I'm starting to agree. Currently about +0.6 in
favour of 'with' now, especially if this is to be
almost exclusively a resource-acquisition statement,
as all our use cases seem to be.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Mon May 16 08:21:57 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 16 May 2005 18:21:57 +1200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051410435473d2b2@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
Message-ID: <42883C05.705@canterbury.ac.nz>

Guido van Rossum wrote:

> PEP 340 is still my favorite, but it seems there's too much opposition
> to it,

I'm not opposed to PEP 340 in principle, but the
ramifications seemed to be getting extraordinarily
complicated, and it seems to be hamstrung by
various backwards-compatibility constraints.
E.g. it seems we can't make for-loops work the way
they should in the face of generator finalisation
or we break old code.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Mon May 16 08:24:59 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 16 May 2005 18:24:59 +1200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4286521E.5010703@ocf.berkeley.edu>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286521E.5010703@ocf.berkeley.edu>
Message-ID: <42883CBB.4080303@canterbury.ac.nz>

Brett C. wrote:

> Nick's was obviously directly against looping, but, with no offense to Nick,
> how many other people were against it looping?  It never felt like it was a
> screaming mass with pitchforks but more of a "I don't love it, but I can deal"
> crowd.

My problem with looping was that, with it, the semantics
of a block statement would be almost, but not quite,
exactly like those of a for-loop, which seems to be
flying in the face of TOOWTDI. And if it weren't for
the can't-finalise-generators-in-a-for-loop backward
compatibility problem, the difference would be even
smaller.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Mon May 16 09:34:22 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 16 May 2005 19:34:22 +1200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com> <428811A8.9000706@ronadam.com>
	<428829CF.5010707@ronadam.com>
Message-ID: <42884CFE.1050608@canterbury.ac.nz>

Ron Adam wrote:

> He should also be able to put try excepts before the yield, and after 
> the yield, or in the block. (But not surrounding the yield, I think.)

I was given to understand that yield is currently
allowed in try-except, just not try-finally. So
this would require a non-backwards-compatible
change.

Greg


From greg.ewing at canterbury.ac.nz  Mon May 16 09:49:35 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 16 May 2005 19:49:35 +1200
Subject: [Python-Dev] Tidier Exceptions
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<4283E9F6.1020500@ocf.berkeley.edu>
	<428437DE.60204@canterbury.ac.nz> <20050514183429.GA12001@panix.com>
Message-ID: <4288508F.5030906@canterbury.ac.nz>

Aahz wrote:
> On Fri, May 13, 2005, Greg Ewing wrote:
>
>>Instead of an 'args' attribute, I'd suggest that the constructor take
>>keyword arguments and store them in corresponding attributes.
> 
> Sounds reasonable, but it should be equally easy to handle::
> 
>     raise MyError, "message"

Certainly, I wasn't suggesting otherwise.

To be more explicit, the base exception class would
look something like

   class Exception:

     def __init__(self, message = None, **kwds):
       self.message = message
       self.__dict__.update(kwds)

     def __str__(self):
       if self.message is not None:
         return self.message
       else:
         return self.__class__.__name__
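
A quick usage sketch of that base class (hypothetical; renamed here so it doesn't shadow the builtin, attribute names as in Greg's draft):

```python
class Exception_(object):
    # Greg's sketch: keyword arguments become attributes.
    def __init__(self, message=None, **kwds):
        self.message = message
        self.__dict__.update(kwds)

    def __str__(self):
        if self.message is not None:
            return self.message
        return self.__class__.__name__

class MyError(Exception_):
    pass

e = MyError("bad value", filename="spam.txt", lineno=3)
assert str(e) == "bad value"
assert e.filename == "spam.txt"
assert str(MyError()) == "MyError"   # falls back to the class name
```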

Greg


From ncoghlan at gmail.com  Mon May 16 10:56:01 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 16 May 2005 18:56:01 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <5.1.1.6.0.20050515211811.01d52038@mail.telecommunity.com>
References: <79990c6b05051506251712ad05@mail.gmail.com>
	<ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286521E.5010703@ocf.berkeley.edu>
	<79990c6b05051506251712ad05@mail.gmail.com>
	<5.1.1.6.0.20050515211811.01d52038@mail.telecommunity.com>
Message-ID: <42886021.50302@gmail.com>

Phillip J. Eby wrote:
> At 08:18 AM 5/16/2005 +1000, Nick Coghlan wrote:
> 
>> Paul Moore wrote:
>> > Looping is definitely a wart. Looping may even be a real problem in
>> > some cases. There may be cases where an explicit try...finally remains
>> > better, simply to avoid an unwanted looping behaviour.
>>
>> I agree PEP 343 throws away too much that was good about PEP 340 - 
>> that's why
>> I'm still updating PEP 3XX as the discussion continues.
> 
> 
> Could you please stop calling it PEP 3XX and go ahead and submit it as a 
> real PEP?  Either that, or post its URL *every* time you mention it, 
> because at this point I don't know where to go to read it, and the same 
> applies for each new person to enter the discussion.  Thanks.

Sorry about that - I've been including the URL most of the time, but forgot on 
this occasion:
http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

Anyway, I think it's stable enough now that I can submit it to be put up on 
www.python.org (I'll notify the PEP editors directly once I fix a couple of 
errors in the current version - like the missing 'raise' in the statement 
semantics. . .).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Mon May 16 11:30:44 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 16 May 2005 19:30:44 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <d11dcfba050515225358feaaa9@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>	<4286F8CB.4090001@gmail.com>
	<d11dcfba050515225358feaaa9@mail.gmail.com>
Message-ID: <42886844.8060102@gmail.com>

Steven Bethard wrote:
> If I've misunderstood, and there are other situations when
> "needs_finish" is required, it'd be nice to see some more examples.

The problem is try/except/else blocks - those are currently legal, so the 
programmer has to make the call about whether finalisation is needed or not.

I'll put this in the Open Issues section of the PEP - doing it lexically seems a 
little too magical for my taste (since it suddenly becomes more difficult to do 
partial iteration on the generator), but the decorator is a definite wart.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From p.f.moore at gmail.com  Mon May 16 11:32:30 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 16 May 2005 10:32:30 +0100
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <d11dcfba050515102736cdd7da@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286521E.5010703@ocf.berkeley.edu>
	<79990c6b05051506251712ad05@mail.gmail.com>
	<d11dcfba050515102736cdd7da@mail.gmail.com>
Message-ID: <79990c6b05051602321629230c@mail.gmail.com>

On 5/15/05, Steven Bethard <steven.bethard at gmail.com> wrote:
> Having done the python-dev summary on this topic,

You have my deepest sympathy :-)

> So in some sense, PEP 340 was the reason for the lack of "enthusiasm";
> with the semantics laid out, people were forced to deal with a specific
> implementation instead of a variety of wild suggestions.

I'm not sure I agree with that - to me, PEP 340 felt like the
consolidation of the previous discussion. My feeling was "cool - we've
had the discussion, now we've formalised the results, maybe a few
details to tidy up and then we can see the implementation being
checked in". Then Nick's proposal *failed* to feel like the tidying up
of the details, and PEP 343 felt like giving up on the powerful (but
hard) bits. It's all people's impressions, though, so maybe I'm just
bitter and cynical :-)

Interestingly, some new ideas have started appearing again (the GUI
example someone raised yesterday, for instance). But with the current
"multiple PEPs" situation, I can't evaluate such ideas, as I've no
clue which of the various proposals would support them.

> > No-one is looking at PEP 343, or
> > Nick's PEP 3XX, and saying "hey, that's neat - I can do XXX with
> > that!". This makes me feel that we've thrown out the baby with the
> > bathwater.
> 
> I'd be surprised if you can find many examples that PEP 340 can do
> that PEP 3XX can't.

In which case, Nick is "marketing" it really badly - I hadn't got that
impression at all. And if Nick's proposal really *is* PEP 340 with the
issues people had resolved, how come Guido isn't supporting it?

(By the way, I agree with Phillip Eby - Nick's proposal really needs to
be issued as a proper PEP - although if it's that close to just being
a fix for PEP 340, it should probably just be the new version of PEP
340).

> > (Yes, I know PEP 342 is integral to many of the neat
> > features, but I get the impression that PEP 342 is being lost - later
> > iterations of the other two PEPs are going out of their way to avoid
> > assuming PEP 342 is implemented...)
> 
> Not very far out of their way.

Well, PEP 343 uses

    def template():
        before
        yield
        after

rather than

    def template():
        before
        try:
            yield
        finally:
            after

which I would argue is better - but it needs PEP 342 functionality.
OTOH, Guido argues there are other reasons for the PEP 343 style.

Also, the discussion has moved to resource objects with special
methods rather than generators as templates - which I see as a direct
consequence of PEP 342 being excluded. One of the things I really
liked about PEP 340 was the "generator template" style of code, with
yield as the "block goes here" placeholder.

> If you want to get people enthused about PEP 342 again (which is the
> right way to make sure it gets accepted), what would really help is a
> bunch of good examples of how it could be used.

In my view, *every* PEP 340/343/3XX example when written in generator
form counts as a good example (see above). Neat coroutine tricks and
the like are the "additional benefits" - maybe bordering on abuses in
some cases, so I don't want to focus on them in case we get into
arguments about whether a feature is bad simply because it *can* be
abused... But the key use case, for me, *is* the generator-as-template
feature.

> I'm convinced that such a tone is inevitable after 30 days and over
> 700 messages on *any* topic. ;-)

Which is why I regret that Guido didn't just go ahead and implement
it, consensus be damned :-) I vote for a dictatorship :-)

> Ok, back to summarizing this fortnight's 380+ PEP 340 messages. ;-)

Best of luck - and in case you need a motivation boost, can I just say
that this sort of thread is why I have the greatest respect and
appreciation for the job the summarisers do.

Paul.

From ncoghlan at gmail.com  Mon May 16 13:46:40 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 16 May 2005 21:46:40 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <42886844.8060102@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>	<4286F8CB.4090001@gmail.com>	<d11dcfba050515225358feaaa9@mail.gmail.com>
	<42886844.8060102@gmail.com>
Message-ID: <42888820.8010009@gmail.com>

Nick Coghlan wrote:
> Steven Bethard wrote:
> 
>>If I've misunderstood, and there are other situations when
>>"needs_finish" is required, it'd be nice to see some more examples.
> 
> 
> The problem is try/except/else blocks - those are currently legal, so the 
> programmer has to make the call about whether finalisation is needed or not.
> 
> I'll put this in the Open Issues section of the PEP - doing it lexically seems a 
> little too magical for my taste (since it suddenly becomes more difficult to do 
> partial iteration on the generator), but the decorator is a definite wart.

I had a better idea - with a decorator being used to create statement templates 
out of generators, that means the __enter__() and __exit__() methods of 
generators themselves are available to handle finalisation.

So, just as my PEP suggests that files be usable like:

   with open(filename) as f:
       for line in f:
           print line

It now suggests generators be usable like:

   with all_lines(filenames) as lines:
       for line in lines:
           print line

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From tzot at mediconsa.com  Mon May 16 01:40:11 2005
From: tzot at mediconsa.com (Christos Georgiou)
Date: Mon, 16 May 2005 02:40:11 +0300
Subject: [Python-Dev] PEP 340 keyword: after
References: <427A185A.90504@v.loewis.de><20050505102339.7b745670@localhost.localdomain>
	<loom.20050505T164909-961@post.gmane.org>
Message-ID: <d69iju$gv7$2@sea.gmane.org>


"Chris Ryland" <cpr at emsoftware.com> wrote in message 
news:loom.20050505T164909-961 at post.gmane.org...

> I hate to add to what could be an endless discussion, but... ;-)
>
> In this case, "while" is the better time-related prefix, whether
> keyword (hopeless, due to ages-old boolean-controlled loop association)
> or function, since you want to imply that the code block is going
> on *while* the lock is held or *while* the file is open (and you also
> want to imply that afterwards, something else happens, i.e., cleanup).
>
> while_locked(myLock):
>    # code that needs to hold the lock

Ah.  You mean 'during' :) 



From ncoghlan at gmail.com  Mon May 16 13:53:01 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 16 May 2005 21:53:01 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051517501b99e8b8@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	
	<d64imp$fai$1@sea.gmane.org>	
	<ca471dc205051410435473d2b2@mail.gmail.com>	
	<4286F8CB.4090001@gmail.com>	
	<d11dcfba050515071553dca64c@mail.gmail.com>
	<ca471dc205051517501b99e8b8@mail.gmail.com>
Message-ID: <4288899D.1060705@gmail.com>

Guido van Rossum wrote:
> [Nick Coghlan]
> 
>>>http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

> I have the same question for Nick. Interestingly, assuming Nick meant
> that "raise" to be there, PEP 3XX and PEP 343 now have the same
> translation. In rev 1.10 I moved the __enter__ call out of the
> try-block again. Having it inside was insane: when __enter__ fails, it
> should do its own cleanup rather than expecting __exit__ to clean up
> after a partial __enter__.

Are you sure? The copy I see on python.org still has it inside the try/finally.

But yes, the differences between PEP 343 and PEP 3XX [1] are not huge, 
particularly if __enter__ is called outside the try/finally block.

The key difference is whether or not exceptions are injected into the generator's
internal frame so that templates can be written using the style from PEP 340.

> But some of the claims from PEP 3XX seem to be incorrect now: Nick
> claims that a with-statement can abstract an except clause, but that's
> not the case; an except clause causes the control flow to go forward
> (continue after the whole try statement) but the with-statement (with
> the "raise" added) always acts like a finally-clause, which implicitly
> re-raises the exception.

Steven's correct - there's a raise statement missing. The point I'm trying to 
make in the PEP is that, even without the ability to suppress exceptions, the 
__exit__() method can still react to them. Then the only code that needs to
be repeated at the calling site is the actual suppression of the exception.

Whether doing such a thing makes sense is going to be application dependent, of 
course.
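
In today's Python, where this part of the design was ultimately adopted, the distinction can be illustrated with a small sketch (hypothetical template; all names invented): __exit__ sees the exception and reacts, but the suppression still happens at the calling site.

```python
class logging_transaction:
    # Hypothetical template: __exit__ can react to an exception
    # (here, recording a rollback) but does not suppress it -- the
    # caller still sees the exception and decides what to do.
    def __init__(self):
        self.events = []
    def __enter__(self):
        self.events.append('begin')
        return self
    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None:
            self.events.append('commit')
        else:
            self.events.append('rollback')
        # Returning None: the exception propagates to the caller.

t = logging_transaction()
try:
    with t:
        raise ValueError('boom')
except ValueError:
    pass  # suppression is repeated at the calling site
# t.events is now ['begin', 'rollback']
```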

> Anyway, I think we may be really close at this point, if we can agree
> on an API for passing exceptions into generators and finalizing them,
> so that the generator can be written using a try/finally around the
> yield statement.

My PEP punts on providing a general API for passing exceptions into generators 
by making it an internal operation.

The version I submitted to the PEP editors uses __enter__() and __exit__() to 
handle finalisation, though.

Cheers,
Nick.

[1] I finally submitted it to the PEP editors, so it'll be up on python.org as 
soon as they find the time to check it in.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Mon May 16 14:25:49 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 16 May 2005 22:25:49 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <79990c6b05051602321629230c@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>	<4286521E.5010703@ocf.berkeley.edu>	<79990c6b05051506251712ad05@mail.gmail.com>	<d11dcfba050515102736cdd7da@mail.gmail.com>
	<79990c6b05051602321629230c@mail.gmail.com>
Message-ID: <4288914D.4020003@gmail.com>

Paul Moore wrote:
> I'm not sure I agree with that - to me, PEP 340 felt like the
> consolidation of the previous discussion. My feeling was "cool - we've
> had the discussion, now we've formalised the results, maybe a few
> details to tidy up and then we can see the implementation being
> checked in". Then Nick's proposal *failed* to feel like the tidying up
> of the details, and PEP 343 felt like giving up on the powerful (but
> hard) bits. It's all people's impressions, though, so maybe I'm just
> bitter and cynical :-)

Actually, I agree that the early versions of my proposal failed to tidy things 
up (I was honestly trying - I just didn't succeed).

I'm pretty happy with the version I just submitted to the PEP editors, though:
http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

It gives generators inherent ``with`` statement support in order to handle 
finalisation (similar to the way it suggests files support the ``with`` 
statement in addition to ``for`` loops).

An appropriately defined decorator then provides the ability to write ``with`` 
statement templates using the PEP 340 generator-based style.

>>I'd be surprised if you can find many examples that PEP 340 can do
>>that PEP 3XX can't.
> 
> In which case, Nick is "marketing" it really badly - I hadn't got that
> impression at all. And if Nick's proposal really *is* PEP 340 with the
> issues people had resolved, how come Guido isn't supporting it?

I think this is because I've been changing it too fast. This is also the main 
reason I hadn't submitted before now - *I* wasn't happy with it, so I wanted to 
keep it easy to update for a while longer.

I hope that the version that appears on python.org will garner a bit more 
support. I submitted it because I finally felt I'd achieved what I set out to do 
(cleanly integrate PEP 310 and PEP 340), whereas none of my previous drafts felt 
that way.

> Also, the discussion has moved to resource objects with special
> methods rather than generators as templates - which I see as a direct
> consequence of PEP 342 being excluded. One of the things I really
> liked about PEP 340 was the "generator template" style of code, with
> yield as the "block goes here" placeholder.

My PEP suggests simple cases are best handled by __enter__/__exit__ methods 
directly on the classes (for efficiency, ease of use, and to avoid cluttering 
the builtin namespace), but more complex cases (like acquiring two locks, or 
using a particular lock to protect access to a certain file) be handled using 
PEP 340 style generator templates.

> In my view, *every* PEP 340/343/3XX example when written in generator
> form counts as a good example (see above). Neat coroutine tricks and
> the like are the "additional benefits" - maybe bordering on abuses in
> some cases, so I don't want to focus on them in case we get into
> arguments about whether a feature is bad simply because it *can* be
> abused... But the key use case, for me, *is* the generator-as-template
> feature.

PEP 342 isn't about the ability to inject exceptions into generators - it's 
about yield expressions and enhanced continue statements. We haven't actually 
identified any use cases for those yet (I'm wondering if there is some way to 
use the idea to implement accumulators, but getting the final result out is a 
real challenge).

PEP 340 and my PEP are the two PEPs that talk about injecting exceptions in
order to write nice statement templates. PEP 288 is the one which would make 
that mechanism available to Python code.

> Which is why I regret that Guido didn't just go ahead and implement
> it, consensus be damned :-) I vote for a dictatorship :-)

Nah, we'll get something good at the end of it anyway - and this way we'll 
*know* it's good :)

>>Ok, back to summarizing this fortnight's 380+ PEP 340 messages. ;-)
> 
> Best of luck - and in case you need a motivation boost, can I just say
> that this sort of thread is why I have the greatest respect and
> appreciation for the job the summarisers do.

Oyah. I had a hard enough time just identifying all the ideas I wanted to 
discuss in the "Rejected Options" section of my PEP.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From aahz at pythoncraft.com  Mon May 16 15:58:17 2005
From: aahz at pythoncraft.com (Aahz)
Date: Mon, 16 May 2005 06:58:17 -0700
Subject: [Python-Dev] Tidier Exceptions
In-Reply-To: <4288508F.5030906@canterbury.ac.nz>
References: <Pine.LNX.4.58.0505121659570.14555@server1.LFW.org>
	<ca471dc2050512160977cc60f@mail.gmail.com>
	<4283E9F6.1020500@ocf.berkeley.edu>
	<428437DE.60204@canterbury.ac.nz>
	<20050514183429.GA12001@panix.com>
	<4288508F.5030906@canterbury.ac.nz>
Message-ID: <20050516135817.GB19862@panix.com>

On Mon, May 16, 2005, Greg Ewing wrote:
> Aahz wrote:
>> On Fri, May 13, 2005, Greg Ewing wrote:
>>>
>>>Instead of an 'args' attribute, I'd suggest that the constructor take
>>>keyword arguments and store them in corresponding attributes.
>> 
>> Sounds reasonable, but it should be equally easy to handle::
>> 
>>     raise MyError, "message"
> 
> To be more explicit, the base exception class would
> look something like
> 
>    class Exception:
> 
>      def __init__(self, message = None, **kwds):
>        self.message = message
>        self.__dict__.update(kwds)
> 
>      def __str__(self):
>        if self.message is not None:
>          return self.message
>        else:
>          return self.__class__.__name__

That works.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"And if that makes me an elitist...I couldn't be happier."  --JMS

From gvanrossum at gmail.com  Mon May 16 16:33:28 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 16 May 2005 07:33:28 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <Pine.LNX.4.58.0505160027310.14555@server1.LFW.org>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
	<d11dcfba050515071553dca64c@mail.gmail.com>
	<ca471dc205051517501b99e8b8@mail.gmail.com>
	<Pine.LNX.4.58.0505160027310.14555@server1.LFW.org>
Message-ID: <ca471dc2050516073358d6ce17@mail.gmail.com>

[Guido]
> > In rev 1.10 I moved the __enter__ call out of the
> > try-block again. Having it inside was insane: when __enter__ fails, it
> > should do its own cleanup rather than expecting __exit__ to clean up
> > after a partial __enter__.

[Ka-Ping Yee]
> No, it wasn't insane.  You had a good reason for putting it there.

I did some introspection, and it was definitely a temporary moment of
insanity: (1) I somehow forgot the obvious solution (that __enter__
should clean up its own mess if it doesn't make it to the finish
line); (2) once a generator raises an exception it cannot be resumed,
so the generator-based example I gave can't work. (I was too lazy to
write down the class-based example, which *can* be made to work of
course.)

> The question is what style of implementation you want to encourage.
> 
> If you put __enter__ inside, then you encourage idempotent __exit__,
> which makes resource objects easier to reuse.

But consider threading.RLock (a lock with the semantics of Java's
monitors). Its release() is *not* idempotent, so we couldn't use the
shortcut of making __enter__ and __exit__ methods of the lock itself.
(I think this shortcut may be important for locks because it reduces
the overhead of frequent locking -- no extra objects need to be
allocated.)
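
Both points can be seen directly in current Python, where threading's locks did grow exactly this shortcut:

```python
import threading

# The shortcut Guido describes: the lock itself provides
# __enter__/__exit__, so no wrapper object is allocated per use.
lock = threading.RLock()
with lock:        # __enter__ -> acquire()
    pass          # __exit__ -> release(), even on error

# release() is *not* idempotent: releasing an unheld RLock raises.
try:
    lock.release()
    release_was_silent = True
except RuntimeError:
    release_was_silent = False
```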

> If you put __enter__ outside, that allows the trivial case to be
> written a little more simply, but also makes it hard to reuse.

I skimmed your longer post about that but didn't find the conclusive
evidence that this is so; all I saw was a lot of facts ("you can
implement scenario X in these three ways) but no conclusion.

The real reason I put it inside the try was different: there's a race
condition in the VM where it can raise a KeyboardInterrupt after the
__enter__() call completes but before the try-suite is entered, and
then __exit__() is never called. Similarly, if __enter__() is written
in Python and wraps some other operation, it may complete that other
operation but get a KeyboardInterrupt (or other asynchronous
exception) before reaching the (explicit or implicit) return
statement. Ditto for generators and yield.

But I think this can be solved differently in the actual translation
(as opposed to the "translate-to-valid-pre-2.5-Python"); the call to
__exit__ can be implicit in a new opcode, and then we can at least
guarantee that the interpreter doesn't check for interrupts between a
successful __exit__ call and setting up the finally block.

This doesn't handle an __enter__ written in Python not reaching its
return; but in the presence of interrupts I don't think it's possible
to write such code reliably anyway; it should be written using a "with
signal.blocking()" around the critical code including the return.

I don't want to manipulate signals directly in the VM; it's
platform-specific, expensive, rarely needed, and you never know
whether you aren't invoking some Python code that might do I/O, making
the entire thread uninterruptible for a long time.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Mon May 16 19:18:27 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Mon, 16 May 2005 13:18:27 -0400
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343 -
 Abstract Block Redux)
Message-ID: <5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>

At 06:56 PM 5/16/2005 +1000, Nick Coghlan wrote:
>Anyway, I think it's stable enough now that I can submit it to be put up on
>www.python.org (I'll notify the PEP editors directly once I fix a couple of
>errors in the current version - like the missing 'raise' in the statement
>semantics. . .).

If you have developer checkin privileges, it's best to get a PEP number 
sooner rather than later, if the PEP shows any signs of viability at 
all.  Once you're in the PEP infrastructure people can subscribe to get 
notified when you change it, read the revision history, and so on.

Anyway, I took a look at it, and I mostly like it.  There appears to be an 
error in "Deterministic generator finalisation" (maybe you already know 
this): the _inject_exception() should be called with exc_info, not 
TerminateIteration, and it should swallow StopIteration instead of 
TerminateIteration.  IOW, I think it should look like this:

     def __exit__(self, *exc_info):
         try:
             self._inject_exception(*exc_info)
         except StopIteration:
             pass

Hm.  Oh wait, I just realized - you don't mean this at all.  You're 
describing a use of generators as non-templates.  Ugh.  I think that might 
lead to confusion about the semantics of 'with' and generators.  I'll have 
to think about it some more, but my second impression after a little bit of 
thought is that if you're going to do this, then you should be allowed to 
use 'with' with any object, using the object as VAR if there's no 
__enter__.  My reasoning here is that it then makes it possible for you to 
use arbitrary objects for 'with' without needing to know their 
implementation details.  It should be harmless to use 'with' on objects 
that don't need it.

This insight may actually be true regardless of what generators do or don't 
do; the point is that if you change from using a generator to a built-in 
iterator type, you shouldn't have to change every place you were using the 
'with' blocks to work again.

A further part of this insight: perhaps the 'with' block translation should 
include a 'del VAR' in its finally block, not to mention the equivalent of 
'del stmt_enter,stmt_exit'.  In other words, the binding of VAR should not 
escape the 'with' block.  This would mean that for existing types that use 
__del__ for cleanup (e.g. files and sockets), then 'with open("file") as f' 
would automatically ensure closing under CPython (but other implementations 
would be allowed to wait for GC).  In other words, I'm saying that this:

      with some_expr() as foo:
          # etc.

should also be short for this (in the case where some_expr() has no 
__enter__ or __exit__ methods):

      foo = some_expr()
      try:
          # etc.
      finally:
          del foo

And that could be a useful thing for many existing object types, without 
even updating them for PEP 34[0-9].  :)  It wouldn't be *as* useful for 
non-CPython implementations, but presumably by the time those 
implementations catch up, more code will be out there with 
__enter__/__exit__ methods.  Also, by allowing a default __enter__ to exist 
(that returns self), many objects need only implement an __exit__.  (For 
example, I don't see a point to closed file objects raising an error when 
used in a 'with' block; if you're actually using the file you'll already 
get an error when you use its other methods, and if you're not actually 
using it, there's no point to the error, since close() is idempotent.)

So, at the C API level, I'm thinking something like Py_EnterResource(ob), 
that returns ob if ob has no tp_resource_enter slot defined, otherwise it 
returns the result of calling the method.  Similarly, some sort of 
Py_ExitResource() that guarantees an error return after invoking the 
tp_resource_exit slot (if any).

Finally, note that this extension now makes 'with' seem more like 'with' in 
other languages, because it is now just a scoped variable definition, with 
hooks for the object being scoped to be notified about entry and exit from 
scope.  It does mean that people encountering 'with some_expr()' (without 
an "as") may wonder about whether names inside the scope are somehow 
relative to 'some_expr', but it will probably become clear from context, 
especially via appropriate names.  For example 'with self.__locked' might 
provide that extra bit of clarity beyond 'with self.__lock'.


From pje at telecommunity.com  Mon May 16 19:22:03 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Mon, 16 May 2005 13:22:03 -0400
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <428826E5.8060102@canterbury.ac.nz>
References: <5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
	<5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050516131957.01ee69f8@mail.telecommunity.com>

At 04:51 PM 5/16/2005 +1200, Greg Ewing wrote:
>Phillip J. Eby wrote:
> > This makes it seem awkward for e.g. "do self.__lock", which doesn't
> > make any sense.  But the extra call needed to make it "do
> > locking(self.__lock)" seems sort of gratuitous.
>
>How about
>
>    do holding(self.__lock):

I simply mean that having to have any wrapper at all for common cases seems 
silly.


>It doesn't work so well when you don't already have an
>object with one obvious interpretation of what you want
>to do 'with' it, e.g. you have a pathname and you want
>to open a file.

Um, what's wrong with 'with open("filename") as f'?


>  I've already argued against giving file
>objects __enter__ and __exit__ methods. And I'm -42 on
>giving them to strings. :-)

If strings had them, __enter__ would return self, and __exit__ would do 
nothing.  I fail to see a problem.  :)


From pje at telecommunity.com  Mon May 16 19:25:06 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Mon, 16 May 2005 13:25:06 -0400
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4288282F.1050104@canterbury.ac.nz>
References: <ca471dc2050513205855dcba6e@mail.gmail.com>
	<ca471dc205051317133cf8fd63@mail.gmail.com>
	<5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
	<ca471dc2050513205855dcba6e@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050516132221.01edbeb0@mail.telecommunity.com>

At 04:57 PM 5/16/2005 +1200, Greg Ewing wrote:
>Guido van Rossum wrote:
>
> >>Also, one question: will the "do protocol" be added to built-in "resource"
> >>types?  That is, locks, files, sockets, and so on?
> >
> > One person proposed that and it was shot down by Greg Ewing. I think
> > it's better to require a separate wrapper.
>
>It depends on whether the resource is "reusable".

Why?  If "with" is a "scope statement", then it doesn't make any sense to 
use it with something you intend to reuse later.  The statement itself is 
an assertion that you intend to "release" the resource at the end of the 
block, for whatever "release" means to that object.  Releasing a file is 
obviously closing it, while releasing a lock is obviously unlocking it.


From pje at telecommunity.com  Mon May 16 19:35:01 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Mon, 16 May 2005 13:35:01 -0400
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4288899D.1060705@gmail.com>
References: <ca471dc205051517501b99e8b8@mail.gmail.com>
	<ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
	<d11dcfba050515071553dca64c@mail.gmail.com>
	<ca471dc205051517501b99e8b8@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050516132944.01ee2c98@mail.telecommunity.com>

At 09:53 PM 5/16/2005 +1000, Nick Coghlan wrote:

>My PEP punts on providing a general API for passing exceptions into 
>generators
>by making it an internal operation.

Actually, the proposals you made almost subsume PEPs 288 and 325.  All 
you'd need to do is:

1. move the '__del__' code to a 'close()' method and make __del__ call close()
2. make '_inject_exception()' a public API that returns the next yielded 
value if the exception doesn't propagate

And just like that you've cleaned up the open issues from both 288 and 325, 
IIRC those open issues correctly.
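For reference, the public API sketched in points 1 and 2 is essentially what later shipped as the generator throw() and close() methods (PEP 342); a sketch with modern generators, where the counter generator is purely illustrative:

```python
log = []

def counter():
    try:
        n = 0
        while True:
            yield n
            n += 1
    except ValueError:
        # An exception injected from outside is trapped here;
        # the generator recovers and yields a sentinel value.
        yield -1
    finally:
        log.append("cleaned up")

g = counter()
first = next(g)                   # advance to the first yield
injected = g.throw(ValueError)    # inject; returns the next yielded value
g.close()                         # finalize; runs the finally clause
```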

I personally think that StopIteration, TerminateIteration, 
KeyboardInterrupt and perhaps certain other exceptions should derive from 
some base class other than Exception (e.g. Raisable or some such) to help 
with the bare-except/except Exception problem.  But that's probably best 
addressed by a separate PEP.  :)


From gvanrossum at gmail.com  Mon May 16 20:20:04 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 16 May 2005 11:20:04 -0700
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343 -
	Abstract Block Redux)
In-Reply-To: <5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
References: <5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
Message-ID: <ca471dc2050516112014fb1b7f@mail.gmail.com>

Nick's PEP 3XX is here:
http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

[Phillip J. Eby]
> Anyway, I took a look at it, and I mostly like it.

I like the beginning of Nick's PEP too, since its spec for the with
statement is now identical to PEP 343 apart from the keyword choice
(and I'm leaving that open for a public vote). The fundamental
difference now is that Nick proposes an exception injection and
finalization API for generators. I think that ought to be done as a
separate PEP, since it's relatively separate from the
do-or-with-statement (even though it's clear that PEP 343 would be the
first to benefit).

Maybe Nick can strip his PEP down (or start a new one) to address only
generator exception injection and generator finalization. I think he's
on to something but it needs to be spec'ed more exactly. Continuing
with Phillip...:

> There appears to be an
> error in "Deterministic generator finalisation" (maybe you already know
> this): the _inject_exception() should be called with exc_info, not
> TerminateIteration, and it should swallow StopIteration instead of
> TerminateIteration.  IOW, I think it should look like this:
> 
>      def __exit__(self, *exc_info):
>          try:
>              self._inject_exception(*exc_info)
>          except StopIteration:
>              pass
> 
> Hm.  Oh wait, I just realized - you don't mean this at all.  You're
> describing a use of generators as non-templates.  Ugh.  I think that might
> lead to confusion about the semantics of 'with' and generators.  I'll have
> to think about it some more, but my second impression after a little bit of
> thought is that if you're going to do this, then you should be allowed to
> use 'with' with any object, using the object as VAR if there's no
> __enter__.  My reasoning here is that it then makes it possible for you to
> use arbitrary objects for 'with' without needing to know their
> implementation details.  It should be harmless to use 'with' on objects
> that don't need it.

I think the issue here is not implementation details but whether it
follows a certain protocol. IMO it's totally acceptable to require
that the expression used in a with-statement support an appropriate
protocol, just like we require the expression used in a for-loop to be
an iterable.

> This insight may actually be true regardless of what generators do or don't
> do; the point is that if you change from using a generator to a built-in
> iterator type, you shouldn't have to change every place you were using the
> 'with' blocks to work again.

Huh? The with-statement doesn't loop, and its use of generators is
such that I don't see how you could ever replace it with a built-in
iterator.

> A further part of this insight: perhaps the 'with' block translation should
> include a 'del VAR' in its finally block, not to mention the equivalent of
> 'del stmt_enter,stmt_exit'.  In other words, the binding of VAR should not
> escape the 'with' block.

I've seen Nick's stmt_enter and stmt_exit as implementation details
that probably would be done differently by a real implementation;
personally I don't mind getting the TypeError about a missing __exit__
only when __exit__ is called (after all if this happens the program is
too buggy to care much about the precise semantics) so this is what
PEP 343 does.

About deleting VAR I have mixed feelings. I appreciate the observation
that it's most likely dead after the with-statement is over, but I'm
not sure that explicit deletion is correct. Remember that VAR can be
an arbitrary assignment target, which means it could be a global, or a
tuple, or an indexed list or dict item, or some object's attribute (I
hesitate to write "etc." after such an exhaustive list :-). Example 8
in PEP 340 shows a use case where at least one of the variables (the
error value) could meaningfully survive the block. I think that, given
that we let the for-loop variable survive, we should treat the
with-statement variable the same way. A good compiler can see that
it's dead if it's unused further and delete it earlier, but I don't
think we need more -- especially since in a GC'ed implementation like
Jython or IronPython, deleting it doesn't do us much good.

> This would mean that for existing types that use
> __del__ for cleanup (e.g. files and sockets), then 'with open("file") as f'
> would automatically ensure closing under CPython (but other implementations
> would be allowed to wait for GC).  In other words, I'm saying that this:
> 
>       with some_expr() as foo:
>           # etc.
> 
> should also be short for this (in the case where some_expr() has no
> __enter__ or __exit__ methods):
> 
>       foo = some_expr()
>       try:
>           # etc.
>       finally:
>           del foo

-1 on this. You're trying to pack too much into a single statement.

> And that could be a useful thing for many existing object types, without
> even updating them for PEP 34[0-9].  :)  It wouldn't be *as* useful for
> non-CPython implementations, but presumably by the time those
> implementations catch up, more code will be out there with
> __enter__/__exit__ methods.

Most likely it would just give an implementation-dependent false sense
of security.

Also note that if you really need this, you can usually get the effect
with a del statement *without* a try/finally clause -- if an exception
is raised that you don't explicitly catch, the variable will go out of
scope anyway, so you only need the del upon normal completion of the
block.

> Also, by allowing a default __enter__ to exist
> (that returns self), many objects need only implement an __exit__.  (For
> example, I don't see a point to closed file objects raising an error when
> used in a 'with' block; if you're actually using the file you'll already
> get an error when you use its other methods, and if you're not actually
> using it, there's no point to the error, since close() is idempotent.)
> 
> So, at the C API level, I'm thinking something like Py_EnterResource(ob),
> that returns ob if ob has no tp_resource_enter slot defined, otherwise it
> returns the result of calling the method.  Similarly, some sort of
> Py_ExitResource() that guarantees an error return after invoking the
> tp_resource_exit slot (if any).
> 
> Finally, note that this extension now makes 'with' seem more like 'with' in
> other languages, because it is now just a scoped variable definition, with
> hooks for the object being scoped to be notified about entry and exit from
> scope.  It does mean that people encountering 'with some_expr()' (without
> an "as") may wonder about whether names inside the scope are somehow
> relative to 'some_expr', but it will probably become clear from context,
> especially via appropriate names.  For example 'with self.__locked' might
> provide that extra bit of clarity beyond 'with self.__lock'.

-1 on the whole thing (and noting that this would be an easy extension
if at some point in the future we change our mind).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From shane at hathawaymix.org  Mon May 16 20:29:08 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Mon, 16 May 2005 12:29:08 -0600
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <42883CBB.4080303@canterbury.ac.nz>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>	<4286521E.5010703@ocf.berkeley.edu>
	<42883CBB.4080303@canterbury.ac.nz>
Message-ID: <4288E674.502@hathawaymix.org>

Greg Ewing wrote:
> Brett C. wrote:
> 
> 
>>Nick's was obviously directly against looping, but, with no offense to Nick,
>>how many other people were against it looping?  It never felt like it was a
>>screaming mass with pitchforks but more of a "I don't love it, but I can deal"
>>crowd.
> 
> 
> My problem with looping was that, with it, the semantics
> of a block statement would be almost, but not quite,
> exactly like those of a for-loop, which seems to be
> flying in the face of TOOWTDI. And if it weren't for
> the can't-finalise-generators-in-a-for-loop backward
> compatibility problem, the difference would be even
> smaller.

I wonder if we should reconsider PEP 340, with one change: the block
iterator is required to iterate exactly once.  If it iterates more than
once or not at all, the interpreter raises a RuntimeError, indicating
the iterator can not be used as a block template.  With that change,
'break' and 'continue' will obviously affect 'for' and 'while' loops
rather than block statements.
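The exactly-once rule above can be sketched as a small wrapper; the ExactlyOnce class and its run() method are hypothetical names for illustration, not anything from the PEPs:

```python
class ExactlyOnce:
    """Run a block body against an iterator that must yield exactly once;
    any other behavior raises RuntimeError."""
    def __init__(self, it):
        self.it = iter(it)
    def run(self, body):
        try:
            value = next(self.it)        # must yield at least once
        except StopIteration:
            raise RuntimeError("iterator yielded no value")
        body(value)
        try:
            next(self.it)                # must not yield a second time
        except StopIteration:
            return
        raise RuntimeError("iterator yielded more than once")

def good():
    yield 42

def bad():
    yield 1
    yield 2

results = []
ExactlyOnce(good()).run(results.append)   # results == [42]

raised = False
try:
    ExactlyOnce(bad()).run(lambda v: None)
except RuntimeError:
    raised = True                          # second yield is rejected
```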

Advantages of PEP 340, with this change, over PEP 343:

- we reuse a protocol rather than invent a new protocol.

- decorators aren't necessary.

- it's a step toward more general flow control macros.

At first I wasn't sure people would like the idea of requiring iterators
to iterate exactly once, but I just realized the other PEPs have the
same requirement.

Shane

From gvanrossum at gmail.com  Mon May 16 20:54:54 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 16 May 2005 11:54:54 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4286F8CB.4090001@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
Message-ID: <ca471dc205051611542772aa9c@mail.gmail.com>

[Guido (responding to Fredrik Lundh's "intuitive -1" on PEP 343)]
> > Would it be better if we pulled back in the generator exit handling
> > from PEP 340? That's a pretty self-contained thing, and would let you
> > write try/finally around the yield.

[Nick Coghlan]
> That would be good, in my opinion. I updated PEP 3XX to use this idea:
> http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
> 
> With that update (to version 1.6), PEP 3XX is basically PEP 343, but injecting
> exceptions that occur into the template generator's internal frame instead of
> invoking next().

I'm in favor of the general idea, but would like to separate the error
injection and finalization API for generators into a separate PEP,
which would then compete with PEP 288 and PEP 325. I think the API
*should* be made public if it is available internally; I don't see any
implementation reasons why it would simplify the implementation if it
was only available internally.

Here are some issues to resolve in such a PEP.

- What does _inject_exception() return? It seems that it should raise
the exception that was passed into it, but I can't find this written
out explicitly.

- What should happen if a generator, in a finally or except clause
reached from _inject_exception(), executes another yield? I'm tempted
to make this a legitimate outcome (from the generator's perspective)
and reserve it for some future statement that implements the looping
semantics of PEP 340; the *_template decorator's wrapper class however
should consider it an error, just like other protocol mismatches like
not yielding the first time or yielding more than once in response to
next(). Nick's code in fact does all this right, I just think it should
be made explicit.

- TerminateIteration is a lousy name, since "terminate" means about
the same as "stop", so there could be legitimate confusion with
StopIteration. In PEP 340 I used StopIteration for this purpose, but
someone explained this was a poor choice since existing generators may
contain code that traps StopIteration for other purposes. Perhaps we
could use SystemExit for this purpose? Pretty much everybody is
supposed to let this one pass through since its purpose is only to
allow cleanup upon program (or thread) exit.

- I really don't like reusing __del__ as the method name for any kind
of destructor; __del__ has all sorts of special semantics (the GC
treats objects with a __del__ method specially).

- The all_lines() example jars me. Somehow it bugs me that its caller
has to remember to care about finalizing it. Maybe it's just not a
good example; I don't see what this offers over just writing the
obviously correct:

    for fn in filenames:
        with opening(fn) as f:
            for line in f:
                update_config(line)

even if using the template saves a line. I doubt the use case comes up
frequently enough to warrant abstracting it out. If I have to import
the template to save a line here, I just made my program a bit less
readable (since the first-time reader has to look up the definition of
the imported all_lines template) and I didn't even save a line unless
the template is used at least twice.

(Note that Nick's PEP contains two typos here -- it says "print lines"
where it should say "print line" and a bit later "print f" where again
it should say "print line".)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Mon May 16 21:12:13 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Mon, 16 May 2005 15:12:13 -0400
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343 -
 Abstract Block Redux)
In-Reply-To: <ca471dc2050516112014fb1b7f@mail.gmail.com>
References: <5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050516144609.028f6cc8@mail.telecommunity.com>

At 11:20 AM 5/16/2005 -0700, Guido van Rossum wrote:
>I think the issue here is not implementation details but whether it
>follows a certain protocol. IMO it's totally acceptable to require
>that the expression used in a with-statement support an appropriate
>protocol, just like we require the expression used in a for-loop to be
>an iterable.

Perhaps I didn't explain well.  I mean that whether early release of a 
resource is desirable, depends on the resource.  For example, consider a 
file versus a StringIO.  If files support resource release and StringIO's 
don't, this pollutes client code with implementation knowledge.

Therefore, the penalty for people trying to clean up resources early is 
that they either pollute their client code with checking to see if things 
are 'with'-able (which seems insane), or else adding empty __enter__ and 
__exit__ methods to things so that their consumers can just use "with" 
whenever they want to scope a resource, whether the resource actually needs 
it or not.
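The "empty __enter__ and __exit__" workaround described above might look like this sketch (NullScope is an invented name; much later the standard library grew contextlib.nullcontext for the same job):

```python
import io

class NullScope:
    """Make any object usable under 'with' by doing nothing on exit."""
    def __init__(self, obj):
        self.obj = obj
    def __enter__(self):
        return self.obj
    def __exit__(self, *exc_info):
        return False  # release nothing, suppress nothing

buf = io.StringIO("hello")
with NullScope(buf) as b:
    text = b.read()
# buf is still open afterwards -- the 'with' was purely a scope hint
```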

I'm suggesting that we simply take Nick's proposal to its logical 
conclusion, and allow any object to be usable under "with", since it does 
not create any problems to do so.  (At least, none that I can see.)  A 
redundant 'with' does no harm; in the worst case it's just a hint to the 
reader about the scope within which an expression is used within the 
current function body.


> > This insight may actually be true regardless of what generators do or don't
> > do; the point is that if you change from using a generator to a built-in
> > iterator type, you shouldn't have to change every place you were using the
> > 'with' blocks to work again.
>
>Huh? The with-statement doesn't loop, and its use of generators is
>such that I don't see how you could ever replace it with a built-in
>iterator.

I'm referring here to Nick's proposal of doing things like this:

     with some_function() as items:
         for item in items:
             # etc.

To ensure that a *normal* generator is finalized.  This isn't about 
generator templates.  So, if you were ever to change 'some_function()' to 
return a list instead of being a generator, for example, this code would no 
longer work.


>About deleting VAR I have mixed feelings. I appreciate the observation
>that it's most likely dead after the with-statement is over, but I'm
>not sure that explicit deletion is correct. Remember that VAR can be
>an arbitrary assignment target, which means it could be a global, or a
>tuple, or an indexed list or dict item, or some object's attribute (I
>hesitate to write "etc." after such an exhaustive list :-).

Argh.  I forgot about that.  :(  On the other hand, I haven't seen any 
examples where arbitrary assignment targets were actually used, so perhaps 
it could be limited to local variables (and possibly-nested tuples 
thereof).  But I also partly agree with the false-sense-of-security thing, 
too, so I'm not sure on this either now that you point out the issue.  It 
also makes me wonder about something like this:

     with open("foo") as f:
         def callback():
             return f.read()

which can't behave sanely if 'callback' is used outside the 'with' 
block.  Of course, it won't work right even if 'f' remains bound to the 
file object, but this does seem to open new implementation complexities to 
get all these pieces to work right.  :(


> > should also be short for this (in the case where some_expr() has no
> > __enter__ or __exit__ methods):
> >
> >       foo = some_expr()
> >       try:
> >           # etc.
> >       finally:
> >           del foo
>
>-1 on this. You're trying to pack too much into a single statement.

Note that I was just leaving out the rest of the standard PEP 3XX expansion 
above, just highlighting the proposed 'del' portion of the expansion.  That 
is, I wasn't proposing an alternate expansion, just showing the effect of 
the other parts of the expansion being skipped because of null 
__enter__/__exit__.


> > And that could be a useful thing for many existing object types, without
> > even updating them for PEP 34[0-9].  :)  It wouldn't be *as* useful for
> > non-CPython implementations, but presumably by the time those
> > implementations catch up, more code will be out there with
> > __enter__/__exit__ methods.
>
>Most likely it would just give an implementation-dependent false sense
>of security.

I think I prefer to see the glass half-full here; that is, I think it 
increases the safety during a transitional period.  But the scoping of 
arbitrary VAR expansions makes things trickier, so I'll think about this 
some more.


>-1 on the whole thing (and noting that this would be an easy extension
>if at some point in the future we change our mind).

Your post has made me think that I might have the concept 
backwards.  Instead of guaranteeing that the variable goes out of scope at 
the end of the block, what a 'with' block actually guarantees (implicit in 
all of the PEP 34x variants) is that EXPR will remain *alive* for the 
duration of the block, without needing to hold it in a variable 
anywhere.  That is:

     with foo():
         bar()

Guarantees that the result of 'foo()' will remain alive at least until the 
block is exited.

(I'm not saying this is anything new; it was implicit all along.  I'm just 
musing about ways to summarize or introduce the 'with' block to people.)


From dw at botanicus.net  Mon May 16 23:32:31 2005
From: dw at botanicus.net (David M. Wilson)
Date: Mon, 16 May 2005 22:32:31 +0100
Subject: [Python-Dev] RFC: rewrite fileinput module to use itertools.
Message-ID: <4289116F.9050002@botanicus.net>

Hi there,

Before charging on ahead with this small task, I was wondering if anyone 
would have any objections to such an upgrade. It seems to me that 
itertools.chain() could come in handy there, at least.

My only objection to the current implementation is that it doesn't 
reflect the current Python "best practice" for what it does, and it 
could be really simply made into a live demo of the itertools module.


David.

From python-dev at zesty.ca  Mon May 16 23:41:54 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Mon, 16 May 2005 16:41:54 -0500 (CDT)
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
Message-ID: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>

This PEP is a concrete proposal for exception chaining, to follow
up on its mention here on Python-Dev last week as well as earlier
discussions in the past year or two.

    http://www.python.org/peps/pep-0344.html

I've tried to summarize the applications for chaining mentioned in
these discussions, survey what's available in other languages, and
come up with a precise specification.

PEP 344 proposes three standard attributes on exception instances:

    __context__ for implicit chaining (an unexpected exception
        occurred during 'except' or 'finally' processing)

    __cause__ for explicit chaining (intentional translation or
        augmenting of exceptions, set by "raise EXC from CAUSE")

    __traceback__ to point to the traceback
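The proposed attributes can be illustrated with the chaining semantics as they eventually landed in Python 3 (the attribute names match the PEP; the toy exceptions are purely illustrative):

```python
# Implicit chaining: an exception raised while handling another one.
try:
    try:
        1 / 0
    except ZeroDivisionError:
        raise KeyError("raised during handling")
except KeyError as e:
    implicit = e.__context__       # the original ZeroDivisionError

# Explicit chaining: "raise EXC from CAUSE".
try:
    try:
        1 / 0
    except ZeroDivisionError as orig:
        raise RuntimeError("translated") from orig
except RuntimeError as e:
    explicit = e.__cause__         # the ZeroDivisionError named in 'from'
    tb = e.__traceback__           # the embedded traceback
```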

Hope this is useful.  I'd like your feedback.  Thanks!


-- Ping

From aahz at pythoncraft.com  Mon May 16 23:46:18 2005
From: aahz at pythoncraft.com (Aahz)
Date: Mon, 16 May 2005 14:46:18 -0700
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
Message-ID: <20050516214618.GA23741@panix.com>

On Mon, May 16, 2005, Ka-Ping Yee wrote:
>
> This PEP is a concrete proposal for exception chaining, to follow
> up on its mention here on Python-Dev last week as well as earlier
> discussions in the past year or two.
> 
>     http://www.python.org/peps/pep-0344.html
> 
> I've tried to summarize the applications for chaining mentioned in
> these discussions, survey what's available in other languages, and
> come up with a precise specification.
> 
> PEP 344 proposes three standard attributes on exception instances:
> 
>     __context__ for implicit chaining (an unexpected exception
>         occurred during 'except' or 'finally' processing)
> 
>     __cause__ for explicit chaining (intentional translation or
>         augmenting of exceptions, set by "raise EXC from CAUSE")
> 
>     __traceback__ to point to the traceback
> 
> Hope this is useful.  I'd like your feedback.  Thanks!

I'll comment here in hopes of staving off responses from multiple
people: I don't think these should be double-underscore attributes.  The
currently undocumented ``args`` attribute isn't double-underscore, and I
think that's precedent to be followed.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"And if that makes me an elitist...I couldn't be happier."  --JMS

From rowen at cesmail.net  Mon May 16 23:58:33 2005
From: rowen at cesmail.net (Russell E. Owen)
Date: Mon, 16 May 2005 14:58:33 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
References: <ca471dc205051517501b99e8b8@mail.gmail.com>
	<ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
	<d11dcfba050515071553dca64c@mail.gmail.com>
	<ca471dc205051517501b99e8b8@mail.gmail.com>
	<4288899D.1060705@gmail.com>
	<5.1.1.6.0.20050516132944.01ee2c98@mail.telecommunity.com>
Message-ID: <rowen-A0D9D2.14583316052005@sea.gmane.org>

In article <5.1.1.6.0.20050516132944.01ee2c98 at mail.telecommunity.com>,
 "Phillip J. Eby" <pje at telecommunity.com> wrote:

>...
> I personally think that StopIteration, TerminateIteration, 
> KeyboardInterrupt and perhaps certain other exceptions should derive from 
> some base class other than Exception (e.g. Raisable or some such) to help 
> with the bare-except/except Exception problem.  But that's probably best 
> addressed by a separate PEP.  :)

Yes, please!!!

I am so sick of writing:
    except (SystemExit, KeyboardInterrupt):
        raise
    except Exception, e:
       # handle a "real" error

but back to lurking on this discussion...

-- Russell


From martin at v.loewis.de  Tue May 17 00:03:24 2005
From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Tue, 17 May 2005 00:03:24 +0200
Subject: [Python-Dev] RFC: rewrite fileinput module to use itertools.
In-Reply-To: <4289116F.9050002@botanicus.net>
References: <4289116F.9050002@botanicus.net>
Message-ID: <428918AC.6020507@v.loewis.de>

David M. Wilson wrote:
> Before charging on ahead with this small task, I was wondering if anyone 
> would have any objections to such an upgrade. It seems to me that 
> itertools.chain() could come in handy there, at least.
> 
> My only objection to the current implementation is that it doesn't 
> reflect the current Python "best practice" for what it does, and it 
> could be really simply made into a live demo of the itertools module.

I personally would not care, but if you have fun doing that, go ahead.
The major requirement would be that the result is 100% (better 150%)
compatible with the old code, even in border cases. I don't know
whether this is possible at all; from experience, I would guess
that there is a significant chance of breaking something.

Regards,
Martin

From gvanrossum at gmail.com  Tue May 17 01:11:36 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 16 May 2005 16:11:36 -0700
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
Message-ID: <ca471dc20505161611497a25b7@mail.gmail.com>

[Ka-Ping Yee]
> This PEP is a concrete proposal for exception chaining, to follow
> up on its mention here on Python-Dev last week as well as earlier
> discussions in the past year or two.
> 
>     http://www.python.org/peps/pep-0344.html

Here's a bunch of commentary:

You're not giving enough credit to Java, which has the "cause" part
nailed IMO.

I like the motivation and rationale, but I think the spec is weak and
would benefit from a better understanding of how exception handling
currently works.  In particular, please read and understand the
comment in ceval.c starting with this line:

    /* Implementation notes for set_exc_info() and reset_exc_info():

There's too much text devoted early on to examples.  I think these
should come at the end; in particular, hiding the proposed semantics
at the very end of a section that otherwise contains only illustrative
examples is a bad idea (especially since the examples are easy enough
to guess, if you've read the rationale).

I don't think the PEP adequately predicts what should happen in this
example:

    def foo():
        try:
            1/0  # raises ZeroDivisionError
        except:
            bar()
            raise sys.does_not_exist  # raises AttributeError

    def bar():
        try:
            1+""  # raises TypeError
        except TypeError:
            pass

Intuitively, the AttributeError should have the ZeroDivisionError as
its __context__, but I think the clearing of the thread's exception
context when the except clause in bar() is left will drop the
exception context.  If you follow the save/restore semantics described
in that ceval.c comment referenced above, you'll get the correct
semantics, I believe.

Also, in that same example, according to your specs, the TypeError
raised by bar() has the ZeroDivisionError raised in foo() as its
context.  Do we really want this?  I still have the feeling that
perhaps the context ought to be attached later, e.g. only when an
exception "passes through" a frame that happens to be handling an
exception already (whether in an except clause or in a finally
clause).

When chaining exceptions, I think it should be an error if the cause
is not an exception instance (or None).  Yes, this will just
substitute a different exception, but I still think it's the right
thing to do -- otherwise code walking the chain of causes must be
constantly aware of this possibility, and since during normal use it
will never happen, that would be a great way to trip it up (perhaps
even to cause a circularity!).

Do we really need both __context__ and __cause__?  Methinks that you
only ever need one: either you explicitly chain a new exception to a
cause, and then the context is probably the same or irrelevant, or you
don't explicitly chain, and then cause is absent.  Since the traceback
printing code is to prefer __cause__ over __context__, why can't we
unify these?  About the only reason I can think of is that with
__cause__ you know it was intentional and with __context__ you know it
wasn't; but when is it important knowing the difference?

Do we really need new syntax to set __cause__?  Java does this without
syntax by having a standard API initCause() (as well as constructors
taking a cause as argument; I understand why you don't want to rely on
that -- neither does Java).  That seems more general because it can be
used outside the context of a raise statement.

Why insert a blank line between chained tracebacks?

In Java, I often find the way chained tracebacks are printed
confusing, because the "deepest" stack frame (where the exception
originally occurred) is no longer at the top of the printout.  I
expect the same confusion to happen for Python, since it prints
everything in the exact opposite order as Java does, so again the
original exception is somewhere in the middle.  I don't think I want
to fix this by printing the outermost exception first and the chained
exception later (which would keep the stack frames in their proper
order but emphasizes the low-level exception rather than the one that
matches the except clause that would have caught it at the outermost
level), but I might want to add an extra line at the very end (and
perhaps at each chaining point) warning the user that the exception
has a chained counterpart that was printed earlier.

Why should the C level APIs not automatically set __context__?  (There
may be an obvious reason but it doesn't hurt stating it.)  You're
unclear on how the C code should be modified to ensure that the proper
calls to PyErr_SetContext() are made.

I was surprised to learn that yield clears the exception state; I
wonder if this isn't a bug in the generator implementation?  IMO
better semantics would be for the exception state to survive across
yield.

You should at least mention what should happen to string exceptions,
even if (as I presume) the only sane approach is not to support this
for string exceptions (a string exception may be the end of the chain,
but it cannot have a chained exception itself).

I don't like the example (in "Open Issues") of applications wrapping
arbitrary exceptions in ApplicationError.  I consider this bad style,
even if the chaining makes it not quite as bad as it used to be.

I don't see the need for "except *, exc" -- assuming all exceptions
derive from a single base class, we can just write the name of that
base class.

I don't like having sys.exception; instead, the only way to access the
"current" exception ought to be to use an except clause with a
variable.  (sys.last_exception is fine.)

I like the idea of taking all APIs that currently require a (type,
value, traceback) triple to *also* accept a single exception instance.

You should probably reference the proposal (pending a PEP; I think
Brett is working on it?) that all exceptions should eventually derive
from a common base class (a la Throwable in Java).

I hope that this can be accepted together with the son-of-PEP-343 (PEP
343 plus generator exception injection and finalization) so __exit__
can take a single exception argument from the start.  (But what should
it receive if a string exception is being caught?  A triple perhaps?)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From bac at OCF.Berkeley.EDU  Tue May 17 01:48:42 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Mon, 16 May 2005 16:48:42 -0700
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <ca471dc20505161611497a25b7@mail.gmail.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<ca471dc20505161611497a25b7@mail.gmail.com>
Message-ID: <4289315A.3030406@ocf.berkeley.edu>

Guido van Rossum wrote:
[SNIP - bunch of points from Guido]
> Do we really need both __context__ and __cause__?  Methinks that you
> only ever need one: either you explicitly chain a new exception to a
> cause, and then the context is probably the same or irrelevant, or you
> don't explicitly chain, and then cause is absent.  Since the traceback
> printing code is to prefer __cause__ over __context__, why can't we
> unify these?  About the only reason I can think of is that with
> __cause__ you know it was intentional and with __context__ you know it
> wasn't; but when is it important knowing the difference?
> 

I am with Guido.  I don't think the need to know if an exception was chained
explicitly or implicitly will be important enough to warrant a separate
attribute.  And if people care that much they can tack on a random attribute
like explicit_chain or something to the exception on their own.
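(As it turned out, Python 3 kept both attributes: implicit chaining sets only `__context__`, while an explicit `raise ... from ...` also sets `__cause__`. A small sketch of the distinction, using the semantics as later adopted:)

```python
# Implicit chaining: raising inside an except block sets __context__ only.
try:
    try:
        1 / 0
    except ZeroDivisionError:
        raise KeyError("lookup failed")
except KeyError as exc:
    implicit = exc

# Explicit chaining: "raise ... from ..." additionally sets __cause__.
try:
    try:
        1 / 0
    except ZeroDivisionError as err:
        raise KeyError("lookup failed") from err
except KeyError as exc:
    explicit = exc
```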

[SNIP]
> In Java, I often find the way chained tracebacks are printed
> confusing, because the "deepest" stack frame (where the exception
> originally occurred) is no longer at the top of the printout.  I
> expect the same confusion to happen for Python, since it prints
> everything in the exact opposite order as Java does, so again the
> original exception is somewhere in the middle.  I don't think I want
> to fix this by printing the outermost exception first and the chained
> exception later (which would keep the stack frames in their proper
> order but emphasizes the low-level exception rather than the one that
> matches the except clause that would have caught it at the outermost
> level), but I might want to add an extra line at the very end (and
> perhaps at each chaining point) warning the user that the exception
> has a chained counterpart that was printed earlier.
> 

Just a simple "[chained exception]" note or something?  Sounds good.

[SNIP]
> You should probably reference the proposal (pending a PEP; I think
> Brett is working on it?)

My plan is to write the "Exceptions in Python 3000" PEP shortly after I start
my internship.  Going to put that at a higher priority than the AST branch to
make sure I get to it some time before I leave the country.  =)

-Brett

From gvanrossum at gmail.com  Tue May 17 01:52:49 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 16 May 2005 16:52:49 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <428811A8.9000706@ronadam.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com> <428811A8.9000706@ronadam.com>
Message-ID: <ca471dc20505161652661f31ef@mail.gmail.com>

[Ron Adam]
> So I was wondering if something like the following is feasible?
> 
[...]
> 
> with opening(file1,m),opening(file2,m),opening(file3,m) as f1,f2,f3:
>      # do stuff with files
> 
> The 'with' (or whatever) statement would need a little more under the
> hood, but it might simplify handling multiple resources.
> 
> This also reduces nesting in cases such as locking and opening. Both
> must succeed before the block executes. And if something goes wrong, the
> "with" statement knows and can handle each resource.  The point is, each
> resource needs to be a whole unit, opening multiple files in one
> resource handler is probably not a good idea anyway.

I'm -0 on this, if only because it complicates things a fair bit for a
very minor improvement in functionality. There are also some
semantic/syntactic questions: should this work only if there are
explicit commas in the with-statement (so the compiler can generate
code equivalent to nested with-statements) or should it allow a single
expression to return a tuple of resource managers dynamically (so the
run-time must check for a tuple each time)?

It's always something we can add in the future, since it's guaranteed
that this syntax (or a tuple value) is invalid in the current
proposal. So I'd rather punt on this.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From jack at performancedrivers.com  Tue May 17 02:21:12 2005
From: jack at performancedrivers.com (Jack Diederich)
Date: Mon, 16 May 2005 20:21:12 -0400
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <42883CBB.4080303@canterbury.ac.nz>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286521E.5010703@ocf.berkeley.edu>
	<42883CBB.4080303@canterbury.ac.nz>
Message-ID: <20050517002112.GH20441@performancedrivers.com>

On Mon, May 16, 2005 at 06:24:59PM +1200, Greg Ewing wrote:
> Brett C. wrote:
> 
> > Nick's was obviously directly against looping, but, with no offense to Nick,
> > how many other people were against it looping?  It never felt like it was a
> > screaming mass with pitchforks but more of a "I don't love it, but I can deal"
> > crowd.
> 
> My problem with looping was that, with it, the semantics
> of a block statement would be almost, but not quite,
> exactly like those of a for-loop, which seems to be
> flying in the face of TOOWTDI. And if it weren't for
> the can't-finalise-generators-in-a-for-loop backward
> compatibility problem, the difference would be even
> smaller.

Nodders, the looping construct seemed to work out fine as code people
could use to get their heads around the idea.  It was eye-gougingly bad
as a final solution.

Forcing people to write an iterator for something that will almost never
loop is as awkward as forcing everyone to write "if" statements as

for dummy in range(1):
  if (somevar):
    do_true_stuff()
    break
else:
  do_false_stuff()

I still haven't gotten used to Guido's heart-attack inducing early 
enthusiasm for strange things followed later by a simple proclamation
I like.  Some day I'll learn that the sound of fingernails on the
chalkboard is frequently followed by candy for the whole class.  
For now the initial stages still give me the shivers.

-jackdied

From pje at telecommunity.com  Tue May 17 03:05:51 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Mon, 16 May 2005 21:05:51 -0400
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <20050517002112.GH20441@performancedrivers.com>
References: <42883CBB.4080303@canterbury.ac.nz>
	<ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286521E.5010703@ocf.berkeley.edu>
	<42883CBB.4080303@canterbury.ac.nz>
Message-ID: <5.1.1.6.0.20050516210514.01ff7050@mail.telecommunity.com>

At 08:21 PM 5/16/2005 -0400, Jack Diederich wrote:
>I still haven't gotten used to Guido's heart-attack inducing early
>enthusiasm for strange things followed later by a simple proclamation
>I like.  Some day I'll learn that the sound of fingernails on the
>chalkboard is frequently followed by candy for the whole class.

Heh.  +1 for QOTW.  :)


From python-dev at zesty.ca  Tue May 17 03:09:54 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Mon, 16 May 2005 20:09:54 -0500 (CDT)
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <20050516214618.GA23741@panix.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<20050516214618.GA23741@panix.com>
Message-ID: <Pine.LNX.4.58.0505161956580.14555@server1.LFW.org>

On Mon, 16 May 2005, Aahz wrote:
> I'll comment here in hopes of staving off responses from multiple
> people: I don't think these should be double-underscore attributes.  The
> currently undocumented ``args`` attribute isn't double-underscore, and I
> think that's precedent to be followed.

That isn't the criterion i'm using, though.  Here's my criterion, and
maybe then we can talk about what the right criterion should be:

    System attributes are for protocols defined by the language.

(I'm using the term "system attribute" here to mean "an attribute with
a double-underscore name", which i picked up from something Guido
wrote a while back [1].)

For example, __init__, __add__, __file__, __name__, etc. are all
attributes whose meaning is defined by the language itself as opposed
to the Python library.  A good indicator of this is the fact that
their names are hardcoded into the Python VM.  I reasoned that
__cause__, __context__, and __traceback__ should also be system
attributes since their meaning is defined by Python.

Exceptions are just classes; they're intended to be extended in
arbitrary application-specific ways.  It seemed a good idea to leave
that namespace open.
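(This is the position that eventually won: the language-defined protocol got double-underscore names, leaving the plain namespace free for applications. A small sketch; the `retry_count` attribute is invented here purely as an illustration:)

```python
# The VM sets system attributes like __traceback__ on every raised
# exception; applications remain free to attach plain-named attributes.
try:
    raise ValueError("boom")
except ValueError as exc:
    caught = exc   # "exc" itself is cleared when the except block ends

caught.retry_count = 3   # hypothetical application-specific attribute
```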


-- ?!ng

[1] http://mail.python.org/pipermail/python-dev/2003-June/036239.html

From jack at performancedrivers.com  Tue May 17 04:11:47 2005
From: jack at performancedrivers.com (Jack Diederich)
Date: Mon, 16 May 2005 22:11:47 -0400
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <Pine.LNX.4.58.0505161956580.14555@server1.LFW.org>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<20050516214618.GA23741@panix.com>
	<Pine.LNX.4.58.0505161956580.14555@server1.LFW.org>
Message-ID: <20050517021147.GL20441@performancedrivers.com>

On Mon, May 16, 2005 at 08:09:54PM -0500, Ka-Ping Yee wrote:
> On Mon, 16 May 2005, Aahz wrote:
> > I'll comment here in hopes of staving off responses from multiple
> > people: I don't think these should be double-underscore attributes.  The
> > currently undocumented ``args`` attribute isn't double-underscore, and I
> > think that's precedent to be followed.
> 
> That isn't the criterion i'm using, though.  Here's my criterion, and
> maybe then we can talk about what the right criterion should be:
> 
>     System attributes are for protocols defined by the language.
> 
> (I'm using the term "system attribute" here to mean "an attribute with
> a double-underscore name", which i picked up from something Guido
> wrote a while back [1].)
> 
> For example, __init__, __add__, __file__, __name__, etc. are all
> attributes whose meaning is defined by the language itself as opposed
> to the Python library.  A good indicator of this is the fact that
> their names are hardcoded into the Python VM.  I reasoned that
> __cause__, __context__, and __traceback__ should also be system
> attributes since their meaning is defined by Python.
> 
> Exceptions are just classes; they're intended to be extended in
> arbitrary application-specific ways.  It seemed a good idea to leave
> that namespace open.

I prefer trichotomies over dichotomies, but whether single or double
underscores are "the bad" or "the ugly" I'll leave to others.  In Python
double underscores can only mean "I don't handle this, my class does" or
"I'm a C++ weenie, can I pretend this is private?"

Excluding the "private" non-argument, the only question is where it goes
in the class hierarchy.  Is it a property you would normally associate
with the instance, the class of an instance, or the class of a class (type).

To me it feels like a property of the instance.  The values are never
shared by expections of the class so just make it a plain variable to remind
other people of that too.

-jackdied

From jack at performancedrivers.com  Tue May 17 04:19:11 2005
From: jack at performancedrivers.com (Jack Diederich)
Date: Mon, 16 May 2005 22:19:11 -0400
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <20050517021147.GL20441@performancedrivers.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<20050516214618.GA23741@panix.com>
	<Pine.LNX.4.58.0505161956580.14555@server1.LFW.org>
	<20050517021147.GL20441@performancedrivers.com>
Message-ID: <20050517021911.GM20441@performancedrivers.com>

On Mon, May 16, 2005 at 10:11:47PM -0400, Jack Diederich wrote:
> The values are never shared by expections of the class
                                 ^^^^^^^^^^
s/expect/except/

Exceptions are expected by except statements - and ispell can't tell the difference.

-jackdied

From shane at hathawaymix.org  Tue May 17 05:04:35 2005
From: shane at hathawaymix.org (Shane Hathaway)
Date: Mon, 16 May 2005 21:04:35 -0600
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc20505161652661f31ef@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>	<4286F8CB.4090001@gmail.com>
	<428811A8.9000706@ronadam.com>
	<ca471dc20505161652661f31ef@mail.gmail.com>
Message-ID: <42895F43.9060401@hathawaymix.org>

Guido van Rossum wrote:
> [Ron Adam]
> 
>>with opening(file1,m),opening(file2,m),opening(file3,m) as f1,f2,f3:
>>     # do stuff with files
>>
> 
> I'm -0 on this, if only because it complicates things a fair bit for a
> very minor improvement in functionality. [...]
> It's always something we can add in the future, since it's guaranteed
> that this syntax (or a tuple value) is invalid in the current
> proposal. So I'd rather punt on this.

Also, you can already get 90% of this with the combining() wrapper I
posted earlier.

    with combining(opening(file1,m),opening(file2,m),opening(file3,m)
        ) as f1,f2,f3:
        # do stuff with files
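(The combining() wrapper itself isn't reproduced in this thread; a sketch of the same idea in terms of contextlib.ExitStack, a later standard-library addition that postdates this discussion, might look like:)

```python
import io
from contextlib import ExitStack, contextmanager

@contextmanager
def combining(*managers):
    # Enter each manager in order; ExitStack unwinds the ones already
    # entered if a later __enter__ fails, and closes all of them on exit.
    with ExitStack() as stack:
        yield tuple(stack.enter_context(m) for m in managers)

# Stand-ins for the three files in the example above:
s1, s2, s3 = io.StringIO("a"), io.StringIO("b"), io.StringIO("c")
with combining(s1, s2, s3) as (f1, f2, f3):
    data = f1.read() + f2.read() + f3.read()
```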

Shane

From greg.ewing at canterbury.ac.nz  Tue May 17 05:20:19 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 17 May 2005 15:20:19 +1200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <5.1.1.6.0.20050516132221.01edbeb0@mail.telecommunity.com>
References: <ca471dc2050513205855dcba6e@mail.gmail.com>
	<ca471dc205051317133cf8fd63@mail.gmail.com>
	<5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
	<ca471dc2050513205855dcba6e@mail.gmail.com>
	<5.1.1.6.0.20050516132221.01edbeb0@mail.telecommunity.com>
Message-ID: <428962F3.70006@canterbury.ac.nz>

Phillip J. Eby wrote:

> Why?  If "with" is a "scope statement", then it doesn't make any sense 
> to use it with something you intend to reuse later.  The statement 
> itself is an assertion that you intend to "release" the resource at the 
> end of the block, for whatever "release" means to that object.  
> Releasing a file is obviously closing it, while releasing a lock is 
> obviously unlocking it.

I've stopped arguing against giving with-protocol
methods to files, etc., since it was pointed out that
you'll get an exception if you try to re-use one.

I still think it's conceptually cleaner if the object
you use to access the resource is created by the
__enter__ method rather than being something pre-
existing, but I'm willing to concede that PBP here.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From gvanrossum at gmail.com  Tue May 17 06:12:33 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 16 May 2005 21:12:33 -0700
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <20050517021147.GL20441@performancedrivers.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<20050516214618.GA23741@panix.com>
	<Pine.LNX.4.58.0505161956580.14555@server1.LFW.org>
	<20050517021147.GL20441@performancedrivers.com>
Message-ID: <ca471dc2050516211225b24436@mail.gmail.com>

[Jack Diederich]
> I prefer trichotomies over dichotomies, but whether single or double
> underscores are "the bad" or "the ugly" I'll leave to others.  In Python
> double underscores can only mean "I don't handle this, my class does" or
> "I'm a C++ weenie, can I pretend this is private?"
>
> Excluding the "private" non-argument the only question is where it goes
> in the class hierarchy.  Is it a property you would normally associate
> with the instance, the class of an instance, or the class of a class (type).
> 
> To me it feels like a property of the instance.  The values are never
> shared by expections of the class so just make it a plain variable to remind
> other people of that too.

Can't tell if you're just trying to be funny, but that sounds nothing
remotely like the rule I use in my head to decide whether to
make something a system attribute or a regular attribute.

My rule has more to do with who "owns" the namespace on the one hand,
and with "magic" behavior caused (or indicated) by the presence of the
attribute on the other. Class or instance is irrelevant; that most
magic attributes live on classes or modules is just because those are
places where most of the magic is concentrated.

__init__ in a class is a system attribute because it has a magic
meaning (invoked automatically on instantiation). __file__ and
__name__ in a module (and __module__ and __name__ in a class!) are
system attributes because they are "imposing" on the user's use of the
namespace. (Note: next was a mistake; it should have been __next__
because of the "magic" rule.)

Unfortunately I can't quite decide whether either rule applies in the
case of exceptions. I think it's totally plausible to say "all
exceptions derive from Throwable, which predefines the following
attributes: traceback, cause". OTOH making them system attributes is
more backwards compatible.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Tue May 17 06:16:47 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Mon, 16 May 2005 21:16:47 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <428962F3.70006@canterbury.ac.nz>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
	<ca471dc2050513205855dcba6e@mail.gmail.com>
	<5.1.1.6.0.20050516132221.01edbeb0@mail.telecommunity.com>
	<428962F3.70006@canterbury.ac.nz>
Message-ID: <ca471dc205051621163cd74301@mail.gmail.com>

[Greg Ewing]
> I've stopped arguing against giving with-protocol
> methods to files, etc., since it was pointed out that
> you'll get an exception if you try to re-use one.
> 
> I still think it's conceptually cleaner if the object
> you use to access the resource is created by the
> __enter__ method rather than being something pre-
> existing, but I'm willing to concede that PBP here.

PBP? Google finds "Python Browser Poseur" but no definition of IRC
slang. Proven Best Practice? Pakistani Border Patrol? Puffing Billy
Posse?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From ncoghlan at gmail.com  Tue May 17 11:35:57 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 17 May 2005 19:35:57 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <5.1.1.6.0.20050516210514.01ff7050@mail.telecommunity.com>
References: <42883CBB.4080303@canterbury.ac.nz>	<ca471dc205051317133cf8fd63@mail.gmail.com>	<d64imp$fai$1@sea.gmane.org>	<ca471dc205051410435473d2b2@mail.gmail.com>	<4286521E.5010703@ocf.berkeley.edu>	<42883CBB.4080303@canterbury.ac.nz>
	<5.1.1.6.0.20050516210514.01ff7050@mail.telecommunity.com>
Message-ID: <4289BAFD.3010507@gmail.com>

Phillip J. Eby wrote:
> At 08:21 PM 5/16/2005 -0400, Jack Diederich wrote:
> 
>>I still haven't gotten used to Guido's heart-attack inducing early
>>enthusiasm for strange things followed later by a simple proclamation
>>I like.  Some day I'll learn that the sound of fingernails on the
>>chalkboard is frequently followed by candy for the whole class.
> 
> 
> Heh.  +1 for QOTW.  :)

Indeed. Particularly since it sounds like son-of-PEP-343 will be the union of 
PEP 310 and PEP 340 that I've been trying to figure out :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Tue May 17 11:59:42 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 17 May 2005 19:59:42 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051621163cd74301@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	<5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>	<ca471dc2050513205855dcba6e@mail.gmail.com>	<5.1.1.6.0.20050516132221.01edbeb0@mail.telecommunity.com>	<428962F3.70006@canterbury.ac.nz>
	<ca471dc205051621163cd74301@mail.gmail.com>
Message-ID: <4289C08E.4080705@gmail.com>

Guido van Rossum wrote:
> [Greg Ewing]
> 
>>I still think it's conceptually cleaner if the object
>>you use to access the resource is created by the
>>__enter__ method rather than being something pre-
>>existing, but I'm willing to concede that PBP here.
> 
> 
> PBP? Google finds "Python Browser Poseur" but no definition of IRC
> slang. Proven Best Practice? Pakistani Border Patrol? Puffing Billy
> Posse?

Previously Belaboured Point? (Just guessing from context here, but if I'm right, 
  that's one acronym I'm going to have to remember. . .)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Tue May 17 12:09:11 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 17 May 2005 20:09:11 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <ca471dc205051611542772aa9c@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	
	<d64imp$fai$1@sea.gmane.org>	
	<ca471dc205051410435473d2b2@mail.gmail.com>	
	<4286F8CB.4090001@gmail.com>
	<ca471dc205051611542772aa9c@mail.gmail.com>
Message-ID: <4289C2C7.2000508@gmail.com>

Guido van Rossum wrote:
> [Nick Coghlan]
> 
>>That would be good, in my opinion. I updated PEP 3XX to use this idea:
>>http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html
>>
>>With that update (to version 1.6), PEP 3XX is basically PEP 343, but injecting
>>exceptions that occur into the template generator's internal frame instead of
>>invoking next().
> 
> I'm in favor of the general idea, but would like to separate the error
> injection and finalization API for generators into a separate PEP,
> which would then compete with PEP 288 and PEP 325.

Without that it pretty much devolves into the current version of PEP 343, though
(as far as I can tell, the two PEPs now agree on the semantics of with statements).

> I think the API
> *should* be made public if it is available internally; I don't see any
> implementation reasons why it would simplify the implementation if it
> was only available internally.

If it's internal, we don't need to name it immediately - working out the public 
API can then be decoupled from the ability to use generators to write resource 
managers.

> Here are some issues to resolve in such a PEP.
> 
> - What does _inject_exception() return? It seems that it should raise
> the exception that was passed into it, but I can't find this written
> out explicitly.

That's a good point. The intent was for it to be equivalent to the exception 
being reraised at the point of the last yield. At that point, control flows like 
it would for a call to next() with code inside the generator that looked like:

   yield
   exc_type, value, tb = _passed_in_exception()
   raise exc_type, value, tb

> - What should happen if a generator, in a finally or except clause
> reached from _inject_exception(), executes another yield? I'm tempted
> to make this a legitimate outcome (from the generator's perspective)
> and reserve it for some future statement that implements the looping
> semantics of PEP 340; the *_template decorator's wrapper class however
> should consider it an error, just like other protocol mismatches like
> not yielding the first time or yielding more than once in response to
> next(). Nick's code in fact does all this right, I just think it should
> be made explicit.

Yep, that was the intent - you're correct that describing it in the text as well 
would make it clear that this is deliberate.

> - TerminateIteration is a lousy name, since "terminate" means about
> the same as "stop", so there could be legitimate confusion with
> StopIteration. In PEP 340 I used StopIteration for this purpose, but
> someone explained this was a poor choice since existing generators may
> contain code that traps StopIteration for other purposes. Perhaps we
> could use SystemExit for this purpose? Pretty much everybody is
> supposed to let this one pass through since its purpose is only to
> allow cleanup upon program (or thread) exit.

Wouldn't that mean we run the risk of suppressing a *real* SystemExit if it 
occurs while a generator is being finalised?

Perhaps a new exception IteratorExit, which is a subclass of SystemExit. Then
well-behaved code wouldn't trap it accidentally, and the finalisation code
wouldn't suppress a real SystemExit?

> - I really don't like reusing __del__ as the method name for any kind
> of destructor; __del__ has all sorts of special semantics (the GC
> treats objects with a __del__ method specially).

I guess if file objects can live without a __del__ method to automatically 
close, generators that require finalisation can survive without it.

> - The all_lines() example jars me. Somehow it bugs me that its caller
> has to remember to care about finalizing it.

The alternative is to have some form of automatic finalisation in for loops, or 
else follow up on PJE's idea of making with statements allow pretty much 
*anything* to be used as VAR1.

Automatic finalisation (like that now described in the Rejected Options section 
of my PEP) makes iterators/generators that manage resources 'just work' (nice) 
but complicates the semantics of generators (bad) and for loops (bad).

The current version of my PEP means you need to know that a particular 
generator/iterator needs finalisation, and rearrange code to cope with that 
(bad), but keeps for loops simple (nice).

PJE's idea about a permissive with statement may actually give the best of both 
worlds - you can *always* use the with statement, and if the supplied object 
doesn't need cleaning up, there's no more overhead than checking for the 
existence of methods in a couple of slots. (See my answer to Phillip on that 
topic for the gory details)

> Maybe it's just not a
> good example;

I was having trouble thinking of a case where it made sense for the generator to 
manage its own resources. I agree all_lines isn't a great example, but it's a
safe bet that things like it will be written once the restriction on yielding 
inside try/finally is lifted.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From anothermax at gmail.com  Tue May 17 13:25:32 2005
From: anothermax at gmail.com (Jeremy Maxfield)
Date: Tue, 17 May 2005 13:25:32 +0200
Subject: [Python-Dev] Multiple interpreters not compatible with current
	thread module
Message-ID: <93dc9c320505170425392e3d80@mail.gmail.com>

The current threadmodule.c does not seem to correctly support multiple
(sub) interpreters.

This became apparent when using jep - (a Java/Python bridge) and also
seems to cause problems with mod_python.

The incompatibility began in Python version 2.3.5 and has been traced to changes
to the 2.4 threadmodule.c that were backported to the 2.3 delivery.
A bug report was raised on 2005-03-15
(http://sourceforge.net/tracker/index.php?func=detail&aid=1163563&group_id=5470&atid=105470)
which covers the problem in more detail.

I've just submitted a patch (I hope it's correctly formatted)  for
threadmodule.c
(http://sourceforge.net/tracker/index.php?func=detail&aid=1203393&group_id=5470&atid=305470)
adapted from the pre-2.3.5 threadmodule.c (replacing the
PyGILState_XXX calls with those from the earlier thread module).
The patch works correctly but it will probably re-introduce the
problem that the change for threadmodule.c version 2.59 fixed.("Fix
for [ 1010677 ] thread Module Breaks PyGILState_Ensure(),and a test
case.When booting a new thread, use the PyGILState API to manage the
GIL.").
The documentation (http://docs.python.org/api/threads.html) states
"Note that the PyGILState_*() functions assume there is only one
global interpreter (created automatically by Py_Initialize()). Python
still supports the creation of additional interpreters (using
Py_NewInterpreter()), but mixing multiple interpreters and the
PyGILState_*() API is unsupported. ", so it looks like using the
PyGILState_XXX functions in the core threadmodule.c means the
Py_NewInterpreter() call (i.e. multiple interpreters) is no longer
supported when threads are involved.

This problem is preventing us upgrading to Python 2.4 so we'd
obviously like to see a resolution in the next Python release if that
were possible...

Cheers,
Max

From p.f.moore at gmail.com  Tue May 17 14:11:16 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 17 May 2005 13:11:16 +0100
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4289C08E.4080705@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
	<ca471dc2050513205855dcba6e@mail.gmail.com>
	<5.1.1.6.0.20050516132221.01edbeb0@mail.telecommunity.com>
	<428962F3.70006@canterbury.ac.nz>
	<ca471dc205051621163cd74301@mail.gmail.com>
	<4289C08E.4080705@gmail.com>
Message-ID: <79990c6b05051705112dea817e@mail.gmail.com>

On 5/17/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Guido van Rossum wrote:
> > [Greg Ewing]
> >
> >>I still think it's conceptually cleaner if the object
> >>you use to access the resource is created by the
> >>__enter__ method rather than being something pre-
> >>existing, but I'm willing to concede that PBP here.
> >
> >
> > PBP? Google finds "Python Browser Poseur" but no definition of IRC
> > slang. Proven Best Practice? Pakistani Border Patrol? Puffing Billy
> > Posse?
> 
> Previously Belaboured Point? (Just guessing from context here, but if I'm right,
>  that's one acronym I'm going to have to remember. . .)

Practicality Beats Purity, surely...?
Paul

From ncoghlan at gmail.com  Tue May 17 15:03:40 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 17 May 2005 23:03:40 +1000
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343 -
 Abstract Block Redux)
In-Reply-To: <5.1.1.6.0.20050516144609.028f6cc8@mail.telecommunity.com>
References: <5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516144609.028f6cc8@mail.telecommunity.com>
Message-ID: <4289EBAC.1020905@gmail.com>

Phillip J. Eby wrote:
> I'm suggesting that we simply take Nick's proposal to its logical 
> conclusion, and allow any object to be usable under "with", since it 
> does not create any problems to do so.  (At least, none that I can 
> see.)  A redundant 'with' does no harm; in the worst case it's just a 
> hint to the reader about the scope within which an expression is used 
> within the current function body.


Do you mean translating this:

   with EXPR1 as VAR1:
       BLOCK1

To something along the lines of:

   the_stmt = EXPR1
   stmt_enter = getattr(the_stmt, "__enter__", None)
   stmt_exit = getattr(the_stmt, "__exit__", None)

   if stmt_enter is None:
       VAR1 = the_stmt
   else:
       VAR1 = stmt_enter()

   if stmt_exit is None:
       BLOCK1
   else:
       exc = (None, None, None)
       try:
           try:
               BLOCK1
           except:
               exc = sys.exc_info()
               raise
       finally:
           stmt_exit(*exc)


It has a certain elegance - you can switch from using an object that needs 
finalisation to one that doesn't without having to change your code.

And library code can safely ensure finalisation without having to check whether 
the object needs it or not - that check becomes an inherent part of the with 
statement.
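In modern Python syntax, that lookup can be sketched as an ordinary helper function standing in for compiler support (the name run_with and the callable-body convention are illustrative only, not part of the proposal):

```python
import sys

def run_with(obj, body):
    # Sketch of the 'permissive' with statement: objects lacking
    # __enter__/__exit__ are bound as-is and never finalised.
    enter = getattr(obj, "__enter__", None)
    exit_ = getattr(obj, "__exit__", None)
    var = obj if enter is None else enter()
    if exit_ is None:
        return body(var)
    exc = (None, None, None)
    try:
        try:
            return body(var)
        except BaseException:
            exc = sys.exc_info()
            raise
    finally:
        exit_(*exc)
```

A plain integer works just as well as a real resource here: run_with(42, lambda v: v + 1) simply binds 42 and returns 43, while an object with the two methods gets the full enter/exit treatment.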

I'm ambivalent about this one - I see some benefit to it, but there's something 
niggling at the back of my brain that doesn't like it (nothing I can point to in 
particular, though).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Tue May 17 15:05:07 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 17 May 2005 23:05:07 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <79990c6b05051705112dea817e@mail.gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>	
	<5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>	
	<ca471dc2050513205855dcba6e@mail.gmail.com>	
	<5.1.1.6.0.20050516132221.01edbeb0@mail.telecommunity.com>	
	<428962F3.70006@canterbury.ac.nz>	
	<ca471dc205051621163cd74301@mail.gmail.com>	
	<4289C08E.4080705@gmail.com>
	<79990c6b05051705112dea817e@mail.gmail.com>
Message-ID: <4289EC03.3000802@gmail.com>

Paul Moore wrote:
> On 5/17/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>Previously Belaboured Point? (Just guessing from context here, but if I'm right,
>> that's one acronym I'm going to have to remember. . .)
> 
> Practicality Beats Purity, surely...?

D'oh! *slaps forehead*

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Tue May 17 15:08:39 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 17 May 2005 23:08:39 +1000
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4289C2C7.2000508@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>		<d64imp$fai$1@sea.gmane.org>		<ca471dc205051410435473d2b2@mail.gmail.com>		<4286F8CB.4090001@gmail.com>	<ca471dc205051611542772aa9c@mail.gmail.com>
	<4289C2C7.2000508@gmail.com>
Message-ID: <4289ECD7.8030800@gmail.com>

Nick Coghlan wrote:
> Wouldn't that mean we run the risk of suppressing a *real* SystemExit if it 
> occurs while a generator is being finalised?
> 
> Perhaps a new exception IteratorExit, which is a subclass of SystemExit. Then 
> well-behaved code wouldn't trap it accidentally, and the finalisation code wouldn't?

... inadvertently suppress a real SystemExit.

Cheers,
Nick.
Must have been distracted halfway through that sentence :)

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From aahz at pythoncraft.com  Tue May 17 15:36:17 2005
From: aahz at pythoncraft.com (Aahz)
Date: Tue, 17 May 2005 06:36:17 -0700
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <ca471dc2050516211225b24436@mail.gmail.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<20050516214618.GA23741@panix.com>
	<Pine.LNX.4.58.0505161956580.14555@server1.LFW.org>
	<20050517021147.GL20441@performancedrivers.com>
	<ca471dc2050516211225b24436@mail.gmail.com>
Message-ID: <20050517133617.GA2146@panix.com>

On Mon, May 16, 2005, Guido van Rossum wrote:
>
> My rule has more to do with who "owns" the namespace on the one hand,
> and with "magic" behavior caused (or indicated) by the presence of the
> attribute on the other. Class or instance is irrelevant; that most
> magic attributes live on classes or modules is just because those are
> places where most of the magic is concentrated.
> 
> __init__ in a class is a system attribute because it has a magic
> meaning (invoked automatically on instantiation). __file__ and
> __name__ in a module (and __module__ and __name__ in a class!) are
> system attributes because they are "imposing" on the user's use of the
> namespace. (Note: next was a mistake; it should have been __next__
> because of the "magic" rule.)

From my POV, part of the reasoning should be the extent to which the
attribute is intended to be publicly accessible -- part of the primary
documented interface.  __init__ is magic, fine.  But __name__ isn't part
of the primary use for a class, whereas these new exception attributes
will be part of the public interface for exceptions, just like the
methods for the Queue class.  (I'm using Queue in specific because it's
intended to be subclassed.)
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"And if that makes me an elitist...I couldn't be happier."  --JMS

From gvanrossum at gmail.com  Tue May 17 16:17:44 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 17 May 2005 07:17:44 -0700
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343 -
	Abstract Block Redux)
In-Reply-To: <4289EBAC.1020905@gmail.com>
References: <5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516144609.028f6cc8@mail.telecommunity.com>
	<4289EBAC.1020905@gmail.com>
Message-ID: <ca471dc205051707175db05aa6@mail.gmail.com>

[Nick Coghlan (replying to Phillip)]
> Do you mean translating this:
> 
>    with EXPR1 as VAR1:
>        BLOCK1
> 
> To something along the lines of:
> 
>    the_stmt = EXPR1
>    stmt_enter = getattr(the_stmt, "__enter__", None)
>    stmt_exit = getattr(the_stmt, "__exit__", None)
> 
>    if stmt_enter is None:
>        VAR1 = the_stmt
>    else:
>        VAR1 = stmt_enter()
> 
>    if stmt_exit is None:
>        BLOCK1
>    else:
>        exc = (None, None, None)
>        try:
>            try:
>                BLOCK1
>            except:
>                exc = sys.exc_info()
>                raise
>        finally:
>            stmt_exit(*exc)

-1.

The compiler must generate both code paths but one is wasted.

I know this is not a very strong argument, but my gut tells me this
generalization of the with-statement is wrong, so I'll stick to it
regardless of the strength of the argument. The real reason will come
to me.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Tue May 17 16:28:18 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 17 May 2005 07:28:18 -0700
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <20050517133617.GA2146@panix.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<20050516214618.GA23741@panix.com>
	<Pine.LNX.4.58.0505161956580.14555@server1.LFW.org>
	<20050517021147.GL20441@performancedrivers.com>
	<ca471dc2050516211225b24436@mail.gmail.com>
	<20050517133617.GA2146@panix.com>
Message-ID: <ca471dc205051707283f7db264@mail.gmail.com>

[Guido van Rossum]
> > My rule has more to do with who "owns" the namespace on the one hand,
> > and with "magic" behavior caused (or indicated) by the presence of the
> > attribute on the other. Class or instance is irrelevant; that most
> > magic attributes live on classes or modules is just because those are
> > places where most of the magic is concentrated.
> >
> > __init__ in a class is a system attribute because it has a magic
> > meaning (invoked automatically on instantiation). __file__ and
> > __name__ in a module (and __module__ and __name__ in a class!) are
> > system attributes because they are "imposing" on the user's use of the
> > namespace. (Note: next was a mistake; it should have been __next__
> > because of the "magic" rule.)

[Aahz]
> From my POV, part of the reasoning should be the extent to which the
> attribute is intended to be publicly accessible -- part of the primary
> documented interface.  __init__ is magic, fine.  But __name__ isn't part
> of the primary use for a class, whereas these new exception attributes
> will be part of the public interface for exceptions, just like the
> methods for the Queue class.  (I'm using Queue in specific because it's
> intended to be subclassed.)

I dunno. Would you find __doc__ not part of the primary, documented
interface of classes? Or __bases__?

The Queue class is irrelevant because the VM doesn't know about it;
*almost* all system attributes are referenced by the VM (or by the
compiler) at some point(*). The reverse is not true, BTW -- many
built-in objects (e.g. tracebacks, code and function objects) have
non-system attributes.

If this were Python 3000, I'd definitely make the traceback and cause
non-system attributes. Given we're retrofitting, I'm less sure; surely
some user-defined exceptions have cause and/or traceback attributes
already.

(*)  I'd say metadata like __version__ is the exception; it's a system
attribute because it's metadata, and because it lives in a crowded
namespace, and because it was retrofitted as a convention.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Tue May 17 16:41:42 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 17 May 2005 07:41:42 -0700
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <ca471dc20505161611497a25b7@mail.gmail.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<ca471dc20505161611497a25b7@mail.gmail.com>
Message-ID: <ca471dc20505170741204b8dfe@mail.gmail.com>

I figured out the semantics that I'd like to see intuitively for
setting the context. I'm not saying this is all that reasonable, but
I'd like throw it out anyway to see what responses it gets.

Consider

    try:
        BLOCK
    except EXCEPTION, VAR:
        HANDLER

I'd like to see this translated into

    try:
        BLOCK
    except EXCEPTION, VAR:
        __context = VAR
        try:
            HANDLER
        except Exception, __error:
            __error.__context__ = __context
            raise

i.e. the context is set only upon *leaving* the handler. (The
translation for finally is hairier but you get the idea at this
point.)
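That translation can be mimicked today with a small driver function (the name handle is hypothetical; Python 3 eventually made this chaining implicit via PEP 3134):

```python
def handle(block, exc_type, handler):
    # The context is attached only when a *new* exception
    # leaves the handler, per the proposed translation.
    try:
        block()
    except exc_type as var:
        context = var
        try:
            handler(var)
        except Exception as error:
            error.__context__ = context
            raise
```

A replacement exception raised inside the handler then carries the original exception on its __context__ attribute when it propagates out.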

My intuition prefers this over Ping's solution because HANDLER could
easily invoke code that (after many stack levels deep) raises and
catches many exceptions, and I'd hate to see all those be bothered by
the context (far down on the stack) that is irrelevant.

BTW, please study how the traceback is built up. I believe that if we
store the traceback in the exception instance, we have to update the
__traceback__ attribute each time we pop a stack level. IIRC that's
how the traceback chain is built up. (The alternative, building the
whole chain  when the exception is raised, would be too expensive for
exceptions raised *and* caught somewhere deep.)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Tue May 17 17:55:46 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 17 May 2005 11:55:46 -0400
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343 -
 Abstract Block Redux)
In-Reply-To: <4289EBAC.1020905@gmail.com>
References: <5.1.1.6.0.20050516144609.028f6cc8@mail.telecommunity.com>
	<5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516144609.028f6cc8@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050517115316.03d99150@mail.telecommunity.com>

At 11:03 PM 5/17/2005 +1000, Nick Coghlan wrote:
>Do you mean translating this:
>
>    with EXPR1 as VAR1:
>        BLOCK1
>
>To something along the lines of:
>
>    the_stmt = EXPR1
>    stmt_enter = getattr(the_stmt, "__enter__", None)
>    stmt_exit = getattr(the_stmt, "__exit__", None)
>
>    if stmt_enter is None:
>        VAR1 = the_stmt
>    else:
>        VAR1 = stmt_enter()
>
>    if stmt_exit is None:
>        BLOCK1
>    else:
>        exc = (None, None, None)
>        try:
>            try:
>                BLOCK1
>            except:
>                exc = sys.exc_info()
>                raise
>        finally:
>            stmt_exit(*exc)

Essentially, yes; although I was actually suggesting that there be 
PyResource_Enter() and PyResource_Exit() C APIs that would supply the 
default behaviors if the tp_resource_enter and tp_resource_exit slots were 
missing from the object's type.  But that's an implementation detail.


>It has a certain elegance - you can switch from using an object that needs
>finalisation to one that doesn't without having to change your code.
>
>And library code can safely ensure finalisation without having to check 
>whether
>the object needs it or not - that check becomes an inherent part of the with
>statement.

Precisely.


From pje at telecommunity.com  Tue May 17 18:02:15 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 17 May 2005 12:02:15 -0400
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343 -
 Abstract Block Redux)
In-Reply-To: <ca471dc205051707175db05aa6@mail.gmail.com>
References: <4289EBAC.1020905@gmail.com>
	<5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516144609.028f6cc8@mail.telecommunity.com>
	<4289EBAC.1020905@gmail.com>
Message-ID: <5.1.1.6.0.20050517115642.03d951c8@mail.telecommunity.com>

At 07:17 AM 5/17/2005 -0700, Guido van Rossum wrote:
>The compiler must generate both code paths but one is wasted.

Not if the default behavior is encapsulated in PyResource_Enter() 
(returning the object if no __enter__) and PyResource_Exit() (a no-op if no 
__exit__).  You're going to have to have the branch and test for slot 
presence *somewhere*.


>I know this is not a very strong argument,

It's not even an actual argument.  :)


>but my gut tells me this
>generalization of the with-statement is wrong, so I'll stick to it
>regardless of the strength of the argument. The real reason will come
>to me.

I don't know about anybody else, but I'd prefer to hear that your gut tells 
you it's wrong, because that at least tells me there's no point in 
arguing.  Pseudo-reasons just make me think there's something to discuss.  :)


From pje at telecommunity.com  Tue May 17 18:12:48 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 17 May 2005 12:12:48 -0400
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <ca471dc20505170741204b8dfe@mail.gmail.com>
References: <ca471dc20505161611497a25b7@mail.gmail.com>
	<Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<ca471dc20505161611497a25b7@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050517120438.021b7c50@mail.telecommunity.com>

At 07:41 AM 5/17/2005 -0700, Guido van Rossum wrote:
>Consider
>
>     try:
>         BLOCK
>     except EXCEPTION, VAR:
>         HANDLER
>
>I'd like to see this translated into
>
>     try:
>         BLOCK
>     except EXCEPTION, VAR:
>         __context = VAR
>         try:
>             HANDLER
>         except Exception, __error:
>             __error.__context__ = __context
>             raise
>
>i.e. the context is set only upon *leaving* the handler. (The
>translation for finally is hairier but you get the idea at this
>point.)
>
>My intuition prefers this over Ping's solution because HANDLER could
>easily invoke code that (after many stack levels deep) raises and
>catches many exceptions, and I'd hate to see all those be bothered by
>the context (far down on the stack) that is irrelevant.

This seems intuitively correct to me as well.  If an error occurs in an 
error handler, you want the system to add the context for you.  If you're 
writing a handler that creates a replacement exception, you don't have to 
manually wrap the current exception - the system will add it for you when 
the replacement exception leaves the handler.  In both use cases, you want 
the new exception to be prominent, and the old exception is for digging 
deeper into the matter.  Very nice.

Hm.  What about code like this (made-up) example, though?

     def __getattr__(self,attr):
         try:
             return self.__auxattrs[attr]
         except KeyError:
             raise AttributeError,attr

Under this proposal, IIUC, you would get the context exception for the 
original KeyError, but it's "as designed".  In order to get rid of the 
context traceback, you'd have to write:

     def __getattr__(self,attr):
         try:
             return self.__auxattrs[attr]
         except KeyError:
             pass
         raise AttributeError,attr

Which isn't a big deal, but there's a lot of existing code that follows the 
first form, that would lead to a kind of "traceback spam".
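Python later grew explicit syntax for exactly this suppression (PEP 409's raise ... from None), which keeps the first form readable without the context traceback. A sketch in modern syntax, with an illustrative class of my own:

```python
class Record:
    def __init__(self, **aux):
        self._aux = aux

    def __getattr__(self, attr):
        try:
            return self._aux[attr]
        except KeyError:
            # 'from None' marks the KeyError context as suppressed,
            # so it no longer appears in the printed traceback.
            raise AttributeError(attr) from None
```

The KeyError is still recorded on __context__, but __suppress_context__ is set, so the default traceback printer omits it.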


>BTW, please study how the traceback is built up. I believe that if we
>store the traceback in the exception instance, we have to update the
>__traceback__ attribute each time we pop a stack level.

Yes; using .tb_next to build a linked list that goes "towards the exception 
frame".  But at least the context exception doesn't need that, since the 
context traceback can reasonably end at the frame where it was caught.


From eric.nieuwland at xs4all.nl  Tue May 17 19:49:55 2005
From: eric.nieuwland at xs4all.nl (Eric Nieuwland)
Date: Tue, 17 May 2005 19:49:55 +0200
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <ca471dc20505170741204b8dfe@mail.gmail.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<ca471dc20505161611497a25b7@mail.gmail.com>
	<ca471dc20505170741204b8dfe@mail.gmail.com>
Message-ID: <94cc3abfd1b8fd2a83e6fa84b9452508@xs4all.nl>

Guido van Rossum wrote:
> Consider
>
>     try:
>         BLOCK
>     except EXCEPTION, VAR:
>         HANDLER
>
> I'd like to see this translated into
>
>     try:
>         BLOCK
>     except EXCEPTION, VAR:
>         __context = VAR
>         try:
>             HANDLER
>         except Exception, __error:
>             __error.__context__ = __context
>             raise

If I interpret the above translation correctly, then:
     try:
         BLOCK1
     except EXCEPTION1, VAR1:
         try:
             BLOCK2
         except EXCEPTION2, VAR2:
             HANDLER

with exceptions occurring in BLOCK1, BLOCK2 and HANDLER would result in
HANDLER's exception with __context__ set to BLOCK1's exception and 
BLOCK2's exception would be lost.

--eric


From gvanrossum at gmail.com  Tue May 17 20:02:52 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 17 May 2005 11:02:52 -0700
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <94cc3abfd1b8fd2a83e6fa84b9452508@xs4all.nl>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<ca471dc20505161611497a25b7@mail.gmail.com>
	<ca471dc20505170741204b8dfe@mail.gmail.com>
	<94cc3abfd1b8fd2a83e6fa84b9452508@xs4all.nl>
Message-ID: <ca471dc20505171102138355d8@mail.gmail.com>

But that could easily be fixed by appending the context to the end of
the chain, right?
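A hypothetical sketch of that fix, in modern syntax: walk to the end of the chain already attached to the new exception before hooking on the outer context, so no intermediate exception is dropped.

```python
def append_context(error, context):
    # Attach 'context' behind any exceptions already chained onto
    # 'error', so BLOCK2's exception is preserved rather than lost.
    tail = error
    while tail.__context__ is not None:
        tail = tail.__context__
    tail.__context__ = context
```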

On 5/17/05, Eric Nieuwland <eric.nieuwland at xs4all.nl> wrote:
> Guido van Rossum wrote:
> > Consider
> >
> >     try:
> >         BLOCK
> >     except EXCEPTION, VAR:
> >         HANDLER
> >
> > I'd like to see this translated into
> >
> >     try:
> >         BLOCK
> >     except EXCEPTION, VAR:
> >         __context = VAR
> >         try:
> >             HANDLER
> >         except Exception, __error:
> >             __error.__context__ = __context
> >             raise
> 
> If I interpret the above translation correctly, then:
>      try:
>          BLOCK1
>      except EXCEPTION1, VAR1:
>          try:
>              BLOCK2
>          except EXCEPTION2, VAR2:
>              HANDLER
> 
> with exceptions occurring in BLOCK1, BLOCK2 and HANDLER would result in
> HANDLER's exception with __context__ set to BLOCK1's exception and
> BLOCK2's exception would be lost.
> 
> --eric
> 
> 


-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Tue May 17 20:37:46 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 17 May 2005 11:37:46 -0700
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <4289C2C7.2000508@gmail.com>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<4286F8CB.4090001@gmail.com>
	<ca471dc205051611542772aa9c@mail.gmail.com>
	<4289C2C7.2000508@gmail.com>
Message-ID: <ca471dc205051711375d2b37d7@mail.gmail.com>

[Guido van Rossum]
> > I'm in favor of the general idea, but would like to separate the error
> > injection and finalization API for generators into a separate PEP,
> > which would then compete with PEP 288 and PEP 325.

[Nick Coghlan]
> Without that it pretty much devolves into the current version of PEP 343, though
> (as far as I can tell, the two PEP's now agree on the semantics of with statements)

But that's okay, right? We can just accept both PEPs and we're done.
(And PEP 342 at the same time. :-) Discussing one PEP at a time is
sometimes easier, even if they are meant to be used together.

> > I think the API
> > *should* be made public if it is available internally; I don't see any
> > implementation reasons why it would simplify the implementation if it
> > was only available internally.
> 
> If it's internal, we don't need to name it immediately - working out the public
> API can then be decoupled from the ability to use generators to write resource
> managers.

That's a minor point -- it's not like generators have a namespace for
the user that we don't want to pollute, so we can pick any name we
like.

> > Here are some issues to resolve in such a PEP.
> >
> > - What does _inject_exception() return? It seems that it should raise
> > the exception that was passed into it, but I can't find this written
> > out explicitly.
> 
> That's a good point. The intent was for it to be equivalent to the exception
> being reraised at the point of the last yield. At that point, control flows like
> it would for a call to next() with code inside the generator that looked like:
> 
>    yield
>    exc_type, value, tb = _passed_in_exception()
>    raise exc_type, value, tb

So, also, _inject_exception() will appear to raise the exception that
was passed to it, unless the generator catches it.
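This is essentially the behaviour that generator throw() later standardised (PEP 342); a sketch in modern syntax:

```python
def worker():
    try:
        yield "running"
    except ValueError:
        # The injected exception appears to be raised at the paused
        # yield; the generator may trap it and keep yielding.
        yield "caught"

g = worker()
next(g)                    # advance to the first yield
result = g.throw(ValueError)
```

If the generator does not catch the injected exception, it propagates back out of the throw() call instead.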

> > - What should happen if a generator, in a finally or except clause
> > reached from _inject_exception(), executes another yield? I'm tempted
> > to make this a legitimate outcome (from the generator's perspective)
> > and reserve it for some future statement that implements the looping
> > semantics of PEP 340; the *_template decorator's wrapper class however
> > should consider it an error, just like other protocol mismatches like
> > not yielding the first time or yielding more than once in response to
> > next(). Nick's code in fact does all this right, I just think it should
> > be made explicit.
> 
> Yep, that was the intent - you're correct that describing it in the text as well
> would make it clear that this is deliberate.

OK. Please update the PEP then!

> > - TerminateIteration is a lousy name, since "terminate" means about
> > the same as "stop", so there could be legitimate confusion with
> > StopIteration. In PEP 340 I used StopIteration for this purpose, but
> > someone explained this was a poor choice since existing generators may
> > contain code that traps StopIteration for other purposes. Perhaps we
> > could use SystemExit for this purpose? Pretty much everybody is
> > supposed to let this one pass through since its purpose is only to
> > allow cleanup upon program (or thread) exit.
> 
> Wouldn't that mean we run the risk of suppressing a *real* SystemExit if it
> occurs while a generator is being finalised?

D'oh. Yes.

> Perhaps a new exception IteratorExit, which is a subclass of SystemExit. Then
> well-behaved code wouldn't trap [SystemExit] accidentally,
> and the finalisation code wouldn't?

Nah, I think it needs to be a brand spanking new exception. It just
can't be called TerminateIteration. How about GeneratorFinalization to
be utterly clear?

> > - I really don't like reusing __del__ as the method name for any kind
> > of destructor; __del__ has all sorts of special semantics (the GC
> > treats objects with a __del__ method specially).
> 
> I guess if file objects can live without a __del__ method to automatically
> close, generators that require finalisation can survive without it.

Right. At the C level there is of course finalization (the tp_dealloc
slot in the PyTypeObject struct) but it doesn't have a Python entry
point to call it. And that's intentional, since the typical code
executed by that slot *really* destroys the object and must only be
called when the VM is absolutely sure that the object can't be reached
in any other way. If it were callable from Python, that guarantee
would be void.

> > - The all_lines() example jars me. Somehow it bugs me that its caller
> > has to remember to care about finalizing it.
> 
> The alternative is to have some form of automatic finalisation in for loops, or
> else follow up on PJE's idea of making with statements allow pretty much
> *anything* to be used as VAR1.
> 
> Automatic finalisation (like that now described in the Rejected Options section
> of my PEP) makes iterators/generators that manage resources 'just work' (nice)
> but complicates the semantics of generators (bad) and for loops (bad).
> 
> The current version of my PEP means you need to know that a particular
> generator/iterator needs finalisation, and rearrange code to cope with that
> (bad), but keeps for loops simple (nice).

I agree with that so far.

> PJE's idea about a permissive with statement may actually give the best of both
> worlds - you can *always* use the with statement, and if the supplied object
> doesn't need cleaning up, there's no more overhead than checking for the
> existence of methods in a couple of slots. (See my answer to Phillip on that
> topic for the gory details)

But I don't like that solution -- and it's up to you to explain why
(see previous exchange between Phillip & me :-).

> > Maybe it's just not a
> > good example;
> 
> I was having trouble thinking of a case where it made sense for the generator to
manage its own resources. I agree all_lines isn't a great example, but it's a
> safe bet that things like it will be written once the restriction on yielding
> inside try/finally is lifted.

I'd rather drop the example than make one up that's questionable.

It's better to look for potential use cases in existing code than just
look at the proposed construct and ponder "how could I use this..." --
existing code showing a particular idiom/pattern being used repeatedly
shows that there's an actual need.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From mcherm at mcherm.com  Tue May 17 23:42:06 2005
From: mcherm at mcherm.com (Michael Chermside)
Date: Tue, 17 May 2005 14:42:06 -0700
Subject: [Python-Dev] Example for PEP 343
Message-ID: <20050517144206.wmzbalnpias08484@login.werra.lunarpages.com>

In PEP 343 Guido writes:
>    8. Another use for this feature is the Decimal context.  It's left
>       as an exercise for the reader.  (Mail it to me if you'd like to
>       see it here.)

Here are two such examples. Pick your favorite for the PEP.

PS: Writing this helped convince me that allowing the use of generators
instead of classes with the "do_template" decorator is quite nice in
practice, even though it gets confusing (for beginners anyhow) if you
start to think about it too much.

-- Michael Chermside

# ===== SAMPLE #1: increasing precision during a sub-calculation =====

import decimal

@do_template
def with_extra_precision(places=2):
    "Performs nested computation with extra digits of precision."
     decimal.getcontext().prec += places
     yield None
     decimal.getcontext().prec -= places



# == SAMPLE USE of #1 ==
# (Sample taken from the Python Library Reference)

def sin(x):
    "Return the sine of x as measured in radians."
    do with_extra_precision():
        i, lasts, s, fact, num, sign = 1, 0, x, 1, x, 1
        while s != lasts:
            lasts = s
            i += 2
            fact *= i * (i-1)
            num *= x * x
            sign *= -1
            s += num / fact * sign
        return +s

# ===== SAMPLE #2: insisting on exact calculations only =====

import decimal

@do_template
def assert_exact():
    "Raises exception if nested computation is not exact."
    ctx = decimal.getcontext()
    was_inexact = ctx.traps[decimal.Inexact]
    ctx.traps[decimal.Inexact] = True
    yield None
    ctx.traps[decimal.Inexact] = was_inexact


# == SAMPLE USE of #2 ==

# Lemma 2 ensures us that each fraction will divide evenly
do assert_exact():
    total = decimal.Decimal(0)
    for n, d in zip(numerators, denominators):
        total += n / d


From pje at telecommunity.com  Wed May 18 00:05:42 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 17 May 2005 18:05:42 -0400
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <20050517144206.wmzbalnpias08484@login.werra.lunarpages.com
 >
Message-ID: <5.1.1.6.0.20050517180348.01ee6170@mail.telecommunity.com>

At 02:42 PM 5/17/2005 -0700, Michael Chermside wrote:

># ===== SAMPLE #1: increasing precision during a sub-calculation =====
>
>import decimal
>
>@do_template
>def with_extra_precision(places=2):
>     "Performs nested computation with extra digits of precision."
>     decimal.getcontext().prec += 2
>     yield None
>     decimal.getcontext().prec -= 2

Won't this do the wrong thing if something within the block alters the 
precision?


From python at rcn.com  Wed May 18 00:41:03 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 17 May 2005 18:41:03 -0400
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <5.1.1.6.0.20050517180348.01ee6170@mail.telecommunity.com>
Message-ID: <003901c55b31$83e25320$1206a044@oemcomputer>



> -----Original Message-----
> From: python-dev-bounces+python=rcn.com at python.org [mailto:python-dev-
> bounces+python=rcn.com at python.org] On Behalf Of Phillip J. Eby
> Sent: Tuesday, May 17, 2005 6:06 PM
> To: Michael Chermside; gvanrossum at gmail.com
> Cc: python-dev at python.org
> Subject: Re: [Python-Dev] Example for PEP 343
> 
> At 02:42 PM 5/17/2005 -0700, Michael Chermside wrote:
> 
> ># ===== SAMPLE #1: increasing precision during a sub-calculation =====
> >
> >import decimal
> >
> >@do_template
> >def with_extra_precision(places=2):
> >     "Performs nested computation with extra digits of precision."
> >     decimal.getcontext().prec += 2
> >     yield None
> >     decimal.getcontext().prec -= 2
> 
> Won't this do the wrong thing if something within the block alters the
> precision?

Right.

It should save, alter, and then restore:

   oldprec = decimal.getcontext().prec
   decimal.getcontext().prec += 2
   yield None
   decimal.getcontext().prec = oldprec
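A still more defensive variant does the restore in a finally clause, so the rollback also runs when the block raises; later Pythons package this exact pattern up as decimal.localcontext(). A sketch using today's contextlib rather than the proposed do_template decorator:

```python
import decimal
from contextlib import contextmanager

@contextmanager
def extra_precision(places=2):
    # Save, alter, and restore the precision; the finally clause
    # guarantees the restore even if the managed block raises.
    ctx = decimal.getcontext()
    oldprec = ctx.prec
    ctx.prec += places
    try:
        yield
    finally:
        ctx.prec = oldprec
```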


Raymond Hettinger

From gvanrossum at gmail.com  Wed May 18 00:43:52 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 17 May 2005 15:43:52 -0700
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <003901c55b31$83e25320$1206a044@oemcomputer>
References: <5.1.1.6.0.20050517180348.01ee6170@mail.telecommunity.com>
	<003901c55b31$83e25320$1206a044@oemcomputer>
Message-ID: <ca471dc205051715431226914f@mail.gmail.com>

What's the advantage of using two calls to getcontext() vs. saving the
context in a local variable?

On 5/17/05, Raymond Hettinger <python at rcn.com> wrote:
> 
> 
> > -----Original Message-----
> > From: python-dev-bounces+python=rcn.com at python.org [mailto:python-dev-
> > bounces+python=rcn.com at python.org] On Behalf Of Phillip J. Eby
> > Sent: Tuesday, May 17, 2005 6:06 PM
> > To: Michael Chermside; gvanrossum at gmail.com
> > Cc: python-dev at python.org
> > Subject: Re: [Python-Dev] Example for PEP 343
> >
> > At 02:42 PM 5/17/2005 -0700, Michael Chermside wrote:
> >
> > ># ===== SAMPLE #1: increasing precision during a sub-calculation =====
> > >
> > >import decimal
> > >
> > >@do_template
> > >def with_extra_precision(places=2):
> > >     "Performs nested computation with extra digits of precision."
> > >     decimal.getcontext().prec += 2
> > >     yield None
> > >     decimal.getcontext().prec -= 2
> >
> > Won't this do the wrong thing if something within the block alters the
> > precision?
> 
> Right.
> 
> It should save, alter, and then restore:
> 
>    oldprec = decimal.getcontext().prec
>    decimal.getcontext().prec += 2
>    yield None
>    decimal.getcontext().prec = oldprec
> 
> 
> Raymond Hettinger
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: http://mail.python.org/mailman/options/python-dev/guido%40python.org
> 


-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From python at rcn.com  Wed May 18 02:42:07 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 17 May 2005 20:42:07 -0400
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <ca471dc205051715431226914f@mail.gmail.com>
Message-ID: <000701c55b42$6b825580$b92ac797@oemcomputer>

> What's the advantage of using two calls to getcontext() vs. saving the
> context in a local variable?

I prefer saving the context in a local variable but that is just a
micro-optimization.  The presentation with multiple calls to
getcontext() was kept just to match the style of the original -- the
important change was the absolute save and restore versus the original
relative adjust up and adjust down.


Raymond

From greg.ewing at canterbury.ac.nz  Wed May 18 02:48:26 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 18 May 2005 12:48:26 +1200
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343 -
 Abstract Block Redux)
References: <5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516144609.028f6cc8@mail.telecommunity.com>
	<4289EBAC.1020905@gmail.com>
Message-ID: <428A90DA.1020804@canterbury.ac.nz>

Nick Coghlan wrote:

>    the_stmt = EXPR1
>    stmt_enter = getattr(the_stmt, "__enter__", None)
>    stmt_exit = getattr(the_stmt, "__exit__", None)
> 
>    if stmt_enter is None:
>        VAR1 = the_stmt
>    else:
>        VAR1 = stmt_enter()

If we're doing this, it might be better if VAR were simply
bound to EXPR in all cases. Otherwise the ability to liberally
sprinkle with-statements around will be hampered by uncertainty
about what kind of object VAR will end up being.

Greg





From gvanrossum at gmail.com  Wed May 18 03:01:04 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 17 May 2005 18:01:04 -0700
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343 -
	Abstract Block Redux)
In-Reply-To: <428A90DA.1020804@canterbury.ac.nz>
References: <5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516144609.028f6cc8@mail.telecommunity.com>
	<4289EBAC.1020905@gmail.com> <428A90DA.1020804@canterbury.ac.nz>
Message-ID: <ca471dc20505171801457c5053@mail.gmail.com>

And we're back at PEP 310 and you can't really write opening() as a generator.

On 5/17/05, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Nick Coghlan wrote:
> 
> >    the_stmt = EXPR1
> >    stmt_enter = getattr(the_stmt, "__enter__", None)
> >    stmt_exit = getattr(the_stmt, "__exit__", None)
> >
> >    if stmt_enter is None:
> >        VAR1 = the_stmt
> >    else:
> >        VAR1 = stmt_enter()
> 
> If we're doing this, it might be better if VAR were simply
> bound to EXPR in all cases. Otherwise the ability to liberally
> sprinkle with-statements around will be hampered by uncertainty
> about what kind of object VAR will end up being.
> 
> Greg
> 
> 
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: http://mail.python.org/mailman/options/python-dev/guido%40python.org
> 


-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From python at rcn.com  Wed May 18 03:02:58 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 17 May 2005 21:02:58 -0400
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <000701c55b42$6b825580$b92ac797@oemcomputer>
Message-ID: <000f01c55b45$55484d80$b92ac797@oemcomputer>

> > What's the advantage of using two calls to getcontext() vs. saving the
> > context in a local variable?
> 
> I also prefer saving the context in a local variable but that is just a
> micro-optimization.  The presentation with multiple calls to
> getcontext() was kept just to match the style of the original -- the
> important change was the absolute save and restore versus the original
> relative adjust up and adjust down.

One more thought:  Rather than just saving the precision, it is likely
wiser, safer, and more general to just save and restore the whole
context and let the wrapped block only work with a copy.

    oldcontext = decimal.getcontext()
    newcontext = oldcontext.copy()
    newcontext.prec += 2
    yield None
    decimal.setcontext(oldcontext)

This approach defends against various kinds of unruly behavior by the
yield target.



Raymond Hettinger

From greg.ewing at canterbury.ac.nz  Wed May 18 02:55:05 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 18 May 2005 12:55:05 +1200
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<20050516214618.GA23741@panix.com>
	<Pine.LNX.4.58.0505161956580.14555@server1.LFW.org>
	<20050517021147.GL20441@performancedrivers.com>
	<ca471dc2050516211225b24436@mail.gmail.com>
Message-ID: <428A9269.6080805@canterbury.ac.nz>

Guido van Rossum wrote:

> Unfortunately I can't quite decide whether either rule applies in the
> case of exceptions.

I think you'd at least be justified in using the "magic" rule,
since they're set by the exception machinery.

Greg


From pje at telecommunity.com  Wed May 18 03:09:14 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 17 May 2005 21:09:14 -0400
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343 -
 Abstract Block Redux)
In-Reply-To: <428A90DA.1020804@canterbury.ac.nz>
References: <5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516131826.029c9820@mail.telecommunity.com>
	<5.1.1.6.0.20050516144609.028f6cc8@mail.telecommunity.com>
	<4289EBAC.1020905@gmail.com>
Message-ID: <5.1.1.6.0.20050517210537.03283338@mail.telecommunity.com>

At 12:48 PM 5/18/2005 +1200, Greg Ewing wrote:
>Nick Coghlan wrote:
>
> >    the_stmt = EXPR1
> >    stmt_enter = getattr(the_stmt, "__enter__", None)
> >    stmt_exit = getattr(the_stmt, "__exit__", None)
> >
> >    if stmt_enter is None:
> >        VAR1 = the_stmt
> >    else:
> >        VAR1 = stmt_enter()
>
>If we're doing this, it might be better if VAR were simply
>bound to EXPR in all cases. Otherwise the ability to liberally
>sprinkle with-statements around will be hampered by uncertainty
>about what kind of object VAR will end up being.

It'll be whatever kind of object you should get when you're using EXPR as a 
resource.  :)  In other words, it looks like normal duck typing to me: what 
you get when you access something.quack() is whatever 'something' thinks 
you should get.


From bob at redivi.com  Wed May 18 03:10:52 2005
From: bob at redivi.com (Bob Ippolito)
Date: Tue, 17 May 2005 21:10:52 -0400
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <000f01c55b45$55484d80$b92ac797@oemcomputer>
References: <000f01c55b45$55484d80$b92ac797@oemcomputer>
Message-ID: <742F28E2-D60A-436E-9CA4-D2DAD13803CD@redivi.com>


On May 17, 2005, at 9:02 PM, Raymond Hettinger wrote:

>>> What's the advantage of using two calls to getcontext() vs. saving the
>>> context in a local variable?
>>
>> I also prefer saving the context in a local variable but that is just a
>> micro-optimization.  The presentation with multiple calls to
>> getcontext() was kept just to match the style of the original -- the
>> important change was the absolute save and restore versus the original
>> relative adjust up and adjust down.
>
> One more thought:  Rather than just saving the precision, it is likely
> wiser, safer, and more general to just save and restore the whole
> context and let the wrapped block only work with a copy.
>
>     oldcontext = decimal.getcontext()
>     newcontext = oldcontext.copy()
>     newcontext.prec += 2
>     yield None
>     decimal.setcontext(oldcontext)
>
> This approach defends against various kinds of unruly behavior by the
> yield target.

I think you're missing a decimal.setcontext(newcontext) before the  
yield..

-bob


From tdelaney at avaya.com  Wed May 18 03:19:35 2005
From: tdelaney at avaya.com (Delaney, Timothy C (Timothy))
Date: Wed, 18 May 2005 11:19:35 +1000
Subject: [Python-Dev] Example for PEP 343
Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE02520528@au3010avexu1.global.avaya.com>

Bob Ippolito wrote:

>> One more thought:  Rather than just saving the precision, it is
>> likely wiser, safer, and more general to just save and restore the
>> whole context and let the wrapped block only work with a copy.
>> 
>>     oldcontext = decimal.getcontext()
>>     newcontext = oldcontext.copy()
>>     newcontext.prec += 2
>>     yield None
>>     decimal.setcontext(oldcontext)
>> 
>> This approach defends against various kinds of unruly behavior by the
>> yield target.
> 
> I think you're missing a decimal.setcontext(newcontext) before the
> yield..

Seems to me this should be in the standard library ;)

Tim Delaney

From python at rcn.com  Wed May 18 03:21:17 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 17 May 2005 21:21:17 -0400
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <742F28E2-D60A-436E-9CA4-D2DAD13803CD@redivi.com>
Message-ID: <001201c55b47$e4811e80$b92ac797@oemcomputer>

> I think you're missing a decimal.setcontext(newcontext) before the
> yield..

Right.

From greg.ewing at canterbury.ac.nz  Wed May 18 03:16:58 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 18 May 2005 13:16:58 +1200
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<5.1.1.6.0.20050513204536.021d5900@mail.telecommunity.com>
	<ca471dc2050513205855dcba6e@mail.gmail.com>
	<5.1.1.6.0.20050516132221.01edbeb0@mail.telecommunity.com>
	<428962F3.70006@canterbury.ac.nz>
	<ca471dc205051621163cd74301@mail.gmail.com>
	<4289C08E.4080705@gmail.com>
	<79990c6b05051705112dea817e@mail.gmail.com>
	<4289EC03.3000802@gmail.com>
Message-ID: <428A978A.5060003@canterbury.ac.nz>

Nick Coghlan wrote:
> Paul Moore wrote:
> 
>>On 5/17/05, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>
>>>Previously Belaboured Point? (Just guessing from context here, but if I'm right,
>>>that's one acronym I'm going to have to remember. . .)
>>
>>Practicality Beats Purity, surely...?
> 
> 
> D'oh! *slaps forehead*
> 
> Cheers,
> Nick.
> 

Hmmm... looks like Google needs a "Search Only in Python
Terminology" radio button...

Greg



From gvanrossum at gmail.com  Wed May 18 04:36:12 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 17 May 2005 19:36:12 -0700
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <001201c55b47$e4811e80$b92ac797@oemcomputer>
References: <742F28E2-D60A-436E-9CA4-D2DAD13803CD@redivi.com>
	<001201c55b47$e4811e80$b92ac797@oemcomputer>
Message-ID: <ca471dc20505171936425e9b2f@mail.gmail.com>

On 5/17/05, Raymond Hettinger <python at rcn.com> wrote:
> > I think you're missing a decimal.setcontext(newcontext) before the
> > yield..
> 
> Right.

I don't see a call to setcontext() in the sin() example in the library
reference. Is that document wrong? I thought that simply modifying the
parameters of the current context would be sufficient.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From bob at redivi.com  Wed May 18 04:38:25 2005
From: bob at redivi.com (Bob Ippolito)
Date: Tue, 17 May 2005 22:38:25 -0400
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <ca471dc20505171936425e9b2f@mail.gmail.com>
References: <742F28E2-D60A-436E-9CA4-D2DAD13803CD@redivi.com>
	<001201c55b47$e4811e80$b92ac797@oemcomputer>
	<ca471dc20505171936425e9b2f@mail.gmail.com>
Message-ID: <E810DCD5-6ACF-40BB-BF3D-2AF7BF60B7F0@redivi.com>


On May 17, 2005, at 10:36 PM, Guido van Rossum wrote:

> On 5/17/05, Raymond Hettinger <python at rcn.com> wrote:
>
>>> I think you're missing a decimal.setcontext(newcontext) before the
>>> yield..
>>>
>>
>> Right.
>>
>
> I don't see a call to setcontext() in the sin() example in the library
> reference. Is that document wrong? I thought that simply modifying the
> parameters of the current context would be sufficient.

The library reference isn't modifying the parameters in a *copy* of  
the current context -- it changes the current context in place.

-bob


From python at rcn.com  Wed May 18 05:03:38 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 17 May 2005 23:03:38 -0400
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <ca471dc20505171936425e9b2f@mail.gmail.com>
Message-ID: <000001c55b56$30b96a60$ab29a044@oemcomputer>

> I don't see a call to setcontext() in the sin() example in the library
> reference. Is that document wrong? I thought that simply modifying the
> parameters of the current context would be sufficient.

The sin() example is correct.  The precision is changed and restored in
the current context.

However, for a general purpose wrapper, it is preferable to make a
context copy and then restore the original context after the enclosed
block is run.  That guards against the enclosed block making any
unexpected context changes.

Also, since the wrapper is intended to work like a try/finally, it will
make sure the context gets restored even if an exception is raised at
some unexpected point in the middle of the computation.



Raymond 

From gvanrossum at gmail.com  Wed May 18 05:39:35 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 17 May 2005 20:39:35 -0700
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <000001c55b56$30b96a60$ab29a044@oemcomputer>
References: <ca471dc20505171936425e9b2f@mail.gmail.com>
	<000001c55b56$30b96a60$ab29a044@oemcomputer>
Message-ID: <ca471dc205051720391c1a681c@mail.gmail.com>

[Raymond Hettinger]
> The sin() example is correct.  The precision is changed and restored in
> the current context.

I got that eventually. :-)

> However, for a general purpose wrapper, it is preferable to make a
> context copy and then restore the context after the enclosed is run.
> That guards against the enclosed block making any unexpected context
> changes.

(Although if people get in the habit of using the provided wrappers
and the do-statement, there won't be any unexpected changes.)

> Also, since the wrapper is intended to work like a try/finally, it will
> make sure the context gets restored even if an exception is raised at
> some unexpected point in the middle of the computation.

Yes, that's the point of the do-statement. :-)

Anyway, perhaps we should provide this most general template:

  @do_template
  def with_decimal_context():
      oldctx = decimal.getcontext()
      newctx = oldctx.copy()
      decimal.setcontext(newctx)
      yield newctx
      decimal.setcontext(oldctx)

To be used like this:

  do with_decimal_context() as ctx:
      ctx.prec += 2
      # change other settings
      # algorithm goes here

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From bob at redivi.com  Wed May 18 05:45:32 2005
From: bob at redivi.com (Bob Ippolito)
Date: Tue, 17 May 2005 23:45:32 -0400
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <ca471dc205051720391c1a681c@mail.gmail.com>
References: <ca471dc20505171936425e9b2f@mail.gmail.com>
	<000001c55b56$30b96a60$ab29a044@oemcomputer>
	<ca471dc205051720391c1a681c@mail.gmail.com>
Message-ID: <CD4FDAB4-0340-4C6E-A00A-88B9A754EC05@redivi.com>


On May 17, 2005, at 11:39 PM, Guido van Rossum wrote:

> [Raymond Hettinger]
>
>> However, for a general purpose wrapper, it is preferable to make a
>> context copy and then restore the context after the enclosed is run.
>> That guards against the enclosed block making any unexpected context
>> changes.
>>
>
> (Although if people get in the habit of using the provided wrappers
> and the do-statement, there won't be any unexpected changes.)
>
>
>> Also, since the wrapper is intended to work like a try/finally, it  
>> will
>> make sure the context gets restored even if an exception is raised at
>> some unexpected point in the middle of the computation.
>>
>
> Yes, that's the point of the do-statement. :-)
>
> Anyway, perhaps we should provide this most general template:
>
>   @do_template
>   def with_decimal_context():
>       oldctx = decimal.getcontext()
>       newctx = oldctx.copy()
>       decimal.setcontext(newctx)
>       yield newctx
>       decimal.setcontext(oldctx)
>
> To be used like this:
>
>   do with_decimal_context() as ctx:
>       ctx.prec += 2
>       # change other settings
>       # algorithm goes here

I have yet to use the decimal module much, so I may be completely off  
here.. but why not write it like this:

@do_template
def with_decimal_context():
     curctx = decimal.getcontext()
     oldctx = curctx.copy()
     yield curctx
     decimal.setcontext(oldctx)

Saves a line and a context set :)
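For comparison, here is a runnable approximation of this template using a generator-based wrapper (contextlib.contextmanager standing in for the proposed do_template, so the decorator and the with-spelling are assumptions):

```python
import decimal
from contextlib import contextmanager

@contextmanager
def with_decimal_context():
    # Hand the live context to the block, but keep a copy for restoration.
    curctx = decimal.getcontext()
    oldctx = curctx.copy()
    try:
        yield curctx
    finally:
        # Restored even if the block raises -- the try/finally behaviour
        # the do-statement is meant to provide.
        decimal.setcontext(oldctx)

decimal.getcontext().prec = 28
with with_decimal_context() as ctx:
    ctx.prec += 2      # 30 inside the block
# original precision (28) is in effect again here
```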

-bob


From python at rcn.com  Wed May 18 05:47:39 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 17 May 2005 23:47:39 -0400
Subject: [Python-Dev] [Python-checkins] python/nondist/peps pep-0343.txt,
	1.11, 1.12
In-Reply-To: <E1DYALw-0001Rd-RB@sc8-pr-cvs1.sourceforge.net>
Message-ID: <000101c55b5c$56f14b20$ab29a044@oemcomputer>

> +        def sin(x):
> +            "Return the sine of x as measured in radians."
> +            do with_extra_precision():
> +                i, lasts, s, fact, num, sign = 1, 0, x, 1, x, 1
> +                while s != lasts:
> +                    lasts = s
> +                    i += 2
> +                    fact *= i * (i-1)
> +                    num *= x * x
> +                    sign *= -1
> +                    s += num / fact * sign
> +                return +s

One more change:  The final "return +s" should be unindented.  It should
be at the same level as the "do with_extra_precision()".  The purpose of
the "+s" is to force the result to be rounded back to the *original*
precision.
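With the return moved out of the block, the example reads as follows (rewritten here with the later decimal.localcontext() API standing in for the proposed do-statement, purely so the sketch runs):

```python
import decimal
from decimal import Decimal

def sin(x):
    "Return the sine of x as measured in radians."
    with decimal.localcontext() as ctx:   # stands in for: do with_extra_precision():
        ctx.prec += 2
        i, lasts, s, fact, num, sign = 1, 0, x, 1, x, 1
        while s != lasts:
            lasts = s
            i += 2
            fact *= i * (i - 1)
            num *= x * x
            sign *= -1
            s += num / fact * sign
    return +s   # unary plus rounds s back to the *original* precision
```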

This nuance is likely to be the bane of folks who shift back and forth
between different levels of precision.  The following example shows the
kind of oddity that can arise when working with quantities that have not
been rounded to the current precision:

>>> from decimal import getcontext, Decimal as D
>>> getcontext().prec = 3
>>> D('3.104') + D('2.104')
Decimal("5.21")
>>> D('3.104') + D('0.000') + D('2.104')
Decimal("5.20")
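For completeness: applying the unary plus described above to each operand after the precision switch removes the discrepancy (same values as the session above):

```python
from decimal import getcontext, Decimal as D

getcontext().prec = 3
a = +D('3.104') + +D('2.104')                 # Decimal('5.20')
b = +D('3.104') + +D('0.000') + +D('2.104')   # Decimal('5.20')
```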



Raymond

From kbk at shore.net  Wed May 18 06:12:36 2005
From: kbk at shore.net (Kurt B. Kaiser)
Date: Wed, 18 May 2005 00:12:36 -0400 (EDT)
Subject: [Python-Dev] Weekly Python Patch/Bug Summary
Message-ID: <200505180412.j4I4CaB1028858@bayview.thirdcreek.com>

Patch / Bug Summary
___________________

Patches :  339 open ( +7) /  2838 closed ( +4) /  3177 total (+11)
Bugs    :  938 open (+11) /  4962 closed ( +3) /  5900 total (+14)
RFE     :  187 open ( +1) /   157 closed ( +0) /   344 total ( +1)

New / Reopened Patches
______________________

Restore GC support to set objects  (2005-05-11)
       http://python.org/sf/1200018  opened by  Barry A. Warsaw

idlelib patch  (2005-05-11)
       http://python.org/sf/1200038  opened by  sowjanya

Small optimization for PyDict_Merge()  (2005-05-11)
CLOSED http://python.org/sf/1200051  opened by  Barry A. Warsaw

buffer overflow in _cursesmodule.c  (2005-05-11)
       http://python.org/sf/1200134  opened by  Jan Michael Hülsbergen

patch IDLE to allow running anonymous code in editor window  (2005-05-13)
       http://python.org/sf/1201522  opened by  Jeff Shute

allow running multiple instances of IDLE  (2005-05-13)
       http://python.org/sf/1201569  opened by  Jeff Shute

httplib mentions getreply instead of getresponse  (2005-05-16)
       http://python.org/sf/1203094  opened by  Robert Brewer

workaround deprecated ostat structure in <sys/stat.h>  (2005-05-17)
       http://python.org/sf/1203329  opened by  J. J. Snitow

Patch for [ 1163563 ] Sub threads execute in restricted mode  (2005-05-17)
       http://python.org/sf/1203393  opened by  anothermax

Allow larger programs to be frozen under Win32  (2005-05-17)
       http://python.org/sf/1203650  opened by  Gottfried Ganßauge

Patches Closed
______________

Small optimization for PyDict_Merge()  (2005-05-11)
       http://python.org/sf/1200051  closed by  rhettinger

better datetime support for xmlrpclib  (2005-02-10)
       http://python.org/sf/1120353  closed by  montanaro

Add O_SHLOCK/O_EXLOCK to posix  (2005-01-17)
       http://python.org/sf/1103951  closed by  montanaro

Fix _tryorder in webbrowser.py  (2005-03-20)
       http://python.org/sf/1166780  closed by  rodsenra

New / Reopened Bugs
___________________

installation problem with python 2.4.1 on Win2k system  (2005-05-11)
       http://python.org/sf/1199808  opened by  mmkobayashi

Python 2.4.1 Installer ended prematurely  (2005-05-11)
       http://python.org/sf/1199947  opened by  Wai Yip Tung

time module ignores timezone changes  (2005-05-09)
CLOSED http://python.org/sf/1198275  reopened by  bcannon

Windows msi installer fails on virtual drives  (2005-05-11)
       http://python.org/sf/1200287  opened by  bartgrantham

SyntaxError raised on win32 for correct files  (2005-05-12)
       http://python.org/sf/1200686  opened by  Federico Di Gregorio

Wrong word on "raise" page  (2005-05-13)
CLOSED http://python.org/sf/1201438  opened by  Erik Rose

Problem with recursion in dict (crash with core dump)  (2005-05-13)
       http://python.org/sf/1201456  opened by  Vladimir Yu. Stepanov

suspected cPickle memory leak   (2005-05-13)
       http://python.org/sf/1201461  opened by  Alan

Glossary listing bug  (2005-05-14)
CLOSED http://python.org/sf/1201807  opened by  George Yoshida

mimetypes.py does not find mime.types on Mac OS X  (2005-05-14)
       http://python.org/sf/1202018  opened by  Stefan H. Holek

Description of string.lstrip() needs improvement  (2005-05-15)
       http://python.org/sf/1202395  opened by  Roy Smith

httplib docs mentioning HTTPConnection.getreply  (2005-05-15)
       http://python.org/sf/1202475  opened by  Georg Brandl

RE parser too loose with {m,n} construct  (2005-05-15)
       http://python.org/sf/1202493  opened by  Skip Montanaro

a bunch of infinite C recursions  (2005-05-15)
       http://python.org/sf/1202533  opened by  Armin Rigo

Problem with abs function  (2005-05-16)
       http://python.org/sf/1202946  opened by  ric-b

Bugs Closed
___________

time module ignores timezone changes  (2005-05-09)
       http://python.org/sf/1198275  closed by  bcannon

SystemError: error return without exception set  (2005-05-05)
       http://python.org/sf/1195984  closed by  nbajpai

Wrong word on "raise" page  (2005-05-13)
       http://python.org/sf/1201438  closed by  rhettinger

Glossary listing bug  (2005-05-14)
       http://python.org/sf/1201807  closed by  rhettinger

New / Reopened RFE
__________________

enhancing os.chown functionality  (2005-05-12)
       http://python.org/sf/1200804  opened by  gyrof


From tim.peters at gmail.com  Wed May 18 06:28:18 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Wed, 18 May 2005 00:28:18 -0400
Subject: [Python-Dev] [Python-checkins] python/nondist/peps pep-0343.txt,
	1.11, 1.12
In-Reply-To: <000101c55b5c$56f14b20$ab29a044@oemcomputer>
References: <E1DYALw-0001Rd-RB@sc8-pr-cvs1.sourceforge.net>
	<000101c55b5c$56f14b20$ab29a044@oemcomputer>
Message-ID: <1f7befae0505172128f1f9daa@mail.gmail.com>

[Raymond Hettinger]
> ...
> One more change:  The final "return +s" should be unindented.  It should
> be at the same level as the "do with_extra_precision()".  The purpose of
> the "+s" is to force the result to be rounded back to the *original*
> precision.
>
> This nuance is likely to be the bane of folks who shift back and forth
> between different levels of precision.

Well, a typical user will never change precision most of the time.  Of
the remaining uses, most will set precision once at the start of the
program, and never change it again.  Library authors may change
precision frequently, but they should be experts.

> The following example shows the kind of oddity that can arise when
> working with quantities that have not been rounded to the current precision:
>
> >>> from decimal import getcontext, Decimal as D
> >>> getcontext().prec = 3
> >>> D('3.104') + D('2.104')
> Decimal("5.21")
> >>> D('3.104') + D('0.000') + D('2.104')
> Decimal("5.20")

I think it shows more why it was a mistake for the decimal constructor
to extend the standard (the string->decimal operation in the standard
respects context settings; the results differ here because D(whatever)
ignores context settings; having a common operation ignore context is
ugly and error-prone).
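Side by side, with prec=3 (Context.create_decimal() being the standard-conforming, context-respecting conversion):

```python
from decimal import Decimal, getcontext

ctx = getcontext()
ctx.prec = 3

x = Decimal('3.104')             # Decimal('3.104') -- constructor ignores the context
y = ctx.create_decimal('3.104')  # Decimal('3.10')  -- standard conversion rounds
```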

From reinhold-birkenfeld-nospam at wolke7.net  Wed May 18 07:24:00 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Wed, 18 May 2005 07:24:00 +0200
Subject: [Python-Dev] Request for dev permissions
Message-ID: <d6ejg7$gic$1@sea.gmane.org>

Hello,

would anybody mind if I was given permissions on the tracker and CVS, for fixing
small things like bug #1202475?  I feel that I can help you out a bit with this,
and I promise I won't change the interpreter to accept braces...

Reinhold

-- 
Mail address is perfectly valid!


From fumanchu at amor.org  Wed May 18 07:35:40 2005
From: fumanchu at amor.org (Robert Brewer)
Date: Tue, 17 May 2005 22:35:40 -0700
Subject: [Python-Dev] Request for dev permissions
Message-ID: <3A81C87DC164034AA4E2DDFE11D258E37720DA@exchange.hqamor.amorhq.net>

Reinhold Birkenfeld wrote:
> would anybody mind if I was given permissions on the tracker 
> and CVS, for fixing small
> things like bug #1202475. I feel that I can help you others 
> out a bit with this and
> I promise I won't change the interpreter to accept braces...


I made a patch for that one the next day, by the way. #1203094


Robert Brewer
System Architect
Amor Ministries
fumanchu at amor.org

P.S. Do you have a valid email address, RB? I wasn't able to fix up your
nospam address by hand.

From python at rcn.com  Wed May 18 08:03:07 2005
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 18 May 2005 02:03:07 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <1f7befae0505172128f1f9daa@mail.gmail.com>
Message-ID: <000601c55b6f$43231fc0$ab29a044@oemcomputer>

[Raymond]
> > The following example shows the kind of oddity that can arise when
> > working with quantities that have not been rounded to the current
> precision:
> >
> > >>> from decimal import getcontext, Decimal as D
> > >>> getcontext().prec = 3
> > >>> D('3.104') + D('2.104')
> > Decimal("5.21")
> > >>> D('3.104') + D('0.000') + D('2.104')
> > Decimal("5.20")

[Tim]
> I think it shows more why it was a mistake for the decimal constructor
> to extend the standard (the string->decimal operation in the standard
> respects context settings; the results differ here because D(whatever)
> ignores context settings;

For brevity, the above example used the context free constructor, but
the point was to show the consequence of a precision change.  That
oddity occurs even in the absence of a call to the Decimal constructor.
For instance, using the context-aware constructor,
Context.create_decimal(), we get the same oddity when switching
precision:


>>> from decimal import getcontext
>>> context = getcontext()
>>> x = context.create_decimal('3.104')
>>> y = context.create_decimal('2.104')
>>> z = context.create_decimal('0.000')
>>> context.prec = 3
>>> x + y
Decimal("5.21")
>>> x + z + y
Decimal("5.20")

The whole point of the unary plus operation in the decimal module is to
force a rounding using the current context.  This needs to be a standard
practice whenever someone is changing precision in midstream.  Most
folks won't (or shouldn't) be doing that, but those who do (as they
would in the PEP's use case) need a unary plus after switching
precision.

As for why the normal Decimal constructor is context free, PEP 327
indicates discussion on the subject, but who made the decision and why
is not clear. 



Raymond

From ncoghlan at gmail.com  Wed May 18 10:51:51 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 18 May 2005 18:51:51 +1000
Subject: [Python-Dev] Example for PEP 343
In-Reply-To: <ca471dc205051720391c1a681c@mail.gmail.com>
References: <ca471dc20505171936425e9b2f@mail.gmail.com>	<000001c55b56$30b96a60$ab29a044@oemcomputer>
	<ca471dc205051720391c1a681c@mail.gmail.com>
Message-ID: <428B0227.5080907@gmail.com>

Guido van Rossum wrote:
> Anyway, perhaps we should provide this most general template:
> 
>   @do_template
>   def with_decimal_context():
>       oldctx = decimal.getcontext()
>       newctx = oldctx.copy()
>       decimal.setcontext(newctx)
>       yield newctx
>       decimal.setcontext(oldctx)
> 
> To be used like this:
> 
>   do with_decimal_context() as ctx:
>       ctx.prec += 2
>       # change other settings
>       # algorithm goes here


For the 'with' keyword, and the appropriate __enter__/__exit__ methods on 
decimal Contexts, this can be written:

   with decimal.getcontext() as ctx:
       ctx.prec += 2
       # change other settings
       # algorithm goes here
   # Pre-with context guaranteed to be restored here

The decimal.Context methods to make this work:

   def __enter__(self):
       current = getcontext()
       if current is self:
           self._saved_context = self.copy()
       else:
           self._saved_context = current
           setcontext(self)

   def __exit__(self, *exc_info):
       if self._saved_context is None:
           raise RuntimeError("No context saved")
       try:
           setcontext(self._saved_context)
       finally:
           self._saved_context = None
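A standalone sketch of the same save/restore behaviour (ContextGuard is a
hypothetical name; it wraps a context rather than patching decimal.Context):

```python
import decimal

class ContextGuard:
    """Hypothetical wrapper mirroring the __enter__/__exit__ pair above."""

    def __init__(self, ctx):
        self._ctx = ctx
        self._saved_context = None

    def __enter__(self):
        current = decimal.getcontext()
        if current is self._ctx:
            # Entering the already-active context: save a snapshot copy.
            self._saved_context = current.copy()
        else:
            self._saved_context = current
            decimal.setcontext(self._ctx)
        return decimal.getcontext()

    def __exit__(self, *exc_info):
        if self._saved_context is None:
            raise RuntimeError("No context saved")
        try:
            decimal.setcontext(self._saved_context)
        finally:
            self._saved_context = None

decimal.getcontext().prec = 28
with ContextGuard(decimal.getcontext()) as ctx:
    ctx.prec += 2   # 30 inside; the pre-with context is restored afterwards
```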

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From skip at pobox.com  Wed May 18 15:05:05 2005
From: skip at pobox.com (Skip Montanaro)
Date: Wed, 18 May 2005 08:05:05 -0500
Subject: [Python-Dev] Request for dev permissions
In-Reply-To: <3A81C87DC164034AA4E2DDFE11D258E37720DA@exchange.hqamor.amorhq.net>
References: <3A81C87DC164034AA4E2DDFE11D258E37720DA@exchange.hqamor.amorhq.net>
Message-ID: <17035.15745.833336.44211@montanaro.dyndns.org>


    Robert> P.S. Do you have a valid email address, RB? I wasn't able to fix
    Robert> up your nospam address by hand.

That's because it didn't need fixing... Note Reinhold's sig:

    Reinhold> -- 
    Reinhold> Mail address is perfectly valid!

<wink>

Skip

From aahz at pythoncraft.com  Wed May 18 15:05:55 2005
From: aahz at pythoncraft.com (Aahz)
Date: Wed, 18 May 2005 06:05:55 -0700
Subject: [Python-Dev] Decimal construction
In-Reply-To: <1f7befae0505172128f1f9daa@mail.gmail.com>
References: <E1DYALw-0001Rd-RB@sc8-pr-cvs1.sourceforge.net>
	<000101c55b5c$56f14b20$ab29a044@oemcomputer>
	<1f7befae0505172128f1f9daa@mail.gmail.com>
Message-ID: <20050518130555.GA9515@panix.com>

On Wed, May 18, 2005, Tim Peters wrote:
>
> I think it shows more why it was a mistake for the decimal constructor
> to extend the standard (the string->decimal operation in the standard
> respects context settings; the results differ here because D(whatever)
> ignores context settings; having a common operation ignore context is
> ugly and error-prone).

Not sure what the "right" answer is, but I wanted to stick my oar in to
say that I think that Decimal has not been in the field long enough or
widely enough used that we should feel that the API has been set in
stone.  If there's agreement that a mistake was made, let's fix it!
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"And if that makes me an elitist...I couldn't be happier."  --JMS

From gvanrossum at gmail.com  Wed May 18 17:48:19 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 18 May 2005 08:48:19 -0700
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <428A9269.6080805@canterbury.ac.nz>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<20050516214618.GA23741@panix.com>
	<Pine.LNX.4.58.0505161956580.14555@server1.LFW.org>
	<20050517021147.GL20441@performancedrivers.com>
	<ca471dc2050516211225b24436@mail.gmail.com>
	<428A9269.6080805@canterbury.ac.nz>
Message-ID: <ca471dc205051808487bc3e875@mail.gmail.com>

Here's another rule-of-thumb: when the VM and the user *share* the
attribute space of an object, the VM uses system attributes; the VM
uses plain attributes for objects that it owns completely (like code
objects, frames and so on, which rarely figure in user code except for
the explicit purpose of introspection). So I think the PEP should
continue to use __traceback__ etc.

On 5/17/05, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Guido van Rossum wrote:
> 
> > Unfortunately I can't quite decide whether either rule applies in the
> > case of exceptions.
> 
> I think you'd at least be justified in using the "magic" rule,
> since they're set by the exception machinery.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From mcherm at mcherm.com  Wed May 18 18:12:13 2005
From: mcherm at mcherm.com (Michael Chermside)
Date: Wed, 18 May 2005 09:12:13 -0700
Subject: [Python-Dev] Example for PEP 343
Message-ID: <20050518091213.2jysu4iyy144804g@login.werra.lunarpages.com>

Phillip writes:
> >@do_template
> >def with_extra_precision(places=2):
> >     "Performs nested computation with extra digits of precision."
> >     decimal.getcontext().prec += 2
> >     yield None
> >     decimal.getcontext().prec -= 2
>
> Won't this do the wrong thing if something within the block alters
> the precision?

Depends on what behavior you want. I wrote it this way partly because
that's how the example in the manual was written, and partly because
I was thinking "if someone increased the precision by 3 within the
block, then we probably want to leave it increased by 3 on block
exit".

On careful re-consideration, I think it'd be better to require that
the block NOT make unbalanced changes to the precision... we
could verify it using an assert statement.
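
A sketch of that check, written as a bare generator and driven by hand 
(the do_template decorator from the quoted example is omitted here):

```python
import decimal

def with_extra_precision(places=2):
    # Sketch of the template generator with the balance check added.
    ctx = decimal.getcontext()
    before = ctx.prec
    ctx.prec += places
    yield None
    # Require that the block left precision exactly as we set it:
    assert ctx.prec == before + places, "unbalanced precision change"
    ctx.prec -= places

# Driven by hand, as the proposed statement would drive it:
decimal.getcontext().prec = 28
g = with_extra_precision()
next(g)                      # enter: precision is now 30
# ... algorithm goes here ...
try:
    next(g)                  # exit: check balance, restore 28
except StopIteration:
    pass
```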

I avoided caching the context and restoring it, because I WANTED to
allow code in the block to make OTHER alterations to the context and
not clobber them after the block exits (eg: setting or clearing some
trap). It's debatable whether that was a good design choice... there's
a great deal of elegance to Guido's version used like this:

Guido:
>   do with_decimal_context() as ctx:
>       ctx.prec += 2
>       # change other settings
>       # algorithm goes here

However, all of these are minor details compared to the bug that
Raymond points out:

Raymond:
> The final "return +s" should be unindented.  It should
> be at the same level as the "do with_extra_precision()".  The purpose of
> the "+s" is to force the result to be rounded back to the *original*
> precision.

In effect, the "with_extra_precision" wrapper causes the calculations
to be done with higher precision, AND causes any variables set during
the block to retain their higher precision. (It's because context
controls OPERATIONS but changing context never affects individual
Decimal OBJECTS.) So I fear that the whole with_extra_precision()
idea is just likely to tempt people into introducing subtle bugs, and
it should probably be scrapped anyhow. Guido's approach to save-and-
restore context works fine.
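
The OPERATIONS-vs-OBJECTS distinction, and what the trailing "+s" buys, 
is easy to check against the real decimal API (the digits shown assume 
the default rounding mode):

```python
from decimal import Decimal, getcontext

getcontext().prec = 8
y = Decimal(1) / Decimal(7)      # the OPERATION uses 8 digits
getcontext().prec = 4            # lowering precision afterwards...
assert str(y) == '0.14285714'    # ...leaves the OBJECT untouched
assert str(+y) == '0.1429'       # unary plus re-applies the current
                                 # context: this is what "+s" does
```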

-- Michael Chermside

(PS: Another reason that I avoided a basic save-and-restore is that we
have plenty of examples already of 'do' statements that save-and-restore,
I was trying to do something different. It's interesting that what I
tried may have turned out to be a poor idea.)


From gvanrossum at gmail.com  Wed May 18 18:39:11 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 18 May 2005 09:39:11 -0700
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325: generator
	exceptions and cleanup
Message-ID: <ca471dc2050518093929da936c@mail.gmail.com>

I believe that in the discussion about PEP 343 vs. Nick's PEP 3XX
(http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html, still
awaiting PEP moderator approval I believe?) the main difference is
that Nick proposes a way to inject an exception into a generator; and
I've said a few times that I like that idea.

I'd like to propose to make that a separate PEP, which can combine
elements of PEP 288 and PEP 325. Summary:

- g.throw(type, value, traceback) causes the specified exception to be
thrown at the place where the generator g is currently suspended. If
the generator catches the exception and yields another value, that is
the return value of g.throw(). If it doesn't catch the exception, the
throw() appears to raise the same exception passed it (it "falls
through"). If the generator raises another exception (this includes
the StopIteration produced when it returns) that exception is raised
by the throw. In summary, throw() behaves like next() except it raises
an exception at the place of the yield. If the generator is already in
the closed state, throw() just raises the exception it was passed
without going through the generator.
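
(This proposal was later adopted, via PEP 342, in Python 2.5, so the 
described throw() behaviour can be shown with a toy generator under any 
modern Python:

```python
def toy():
    try:
        yield 'first'
    except ValueError:
        # A caught exception may yield again; that value becomes
        # the return value of g.throw().
        yield 'caught'

g = toy()
assert next(g) == 'first'                 # suspend at the first yield
assert g.throw(ValueError) == 'caught'    # raised at the yield, caught
```

An uncaught exception would instead propagate out of the throw() call, 
exactly as described above.)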

- There's a new exception, GeneratorExit, which can be thrown to cause
a generator to clean up. A generator should not yield a value in
response to this exception.

- g.close() throws a GeneratorExit exception in the generator, and
catches it (so g.close() itself does not raise an exception).
g.close() is idempotent -- if the generator is already closed, it is a
no-op. If the generator, against the rules, yields another value, it
is nevertheless marked closed.

- When a generator is GC'ed, its close() method is called (which is a
no-op if it is already closed).
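
(These rules also landed in Python 2.5; a minimal sketch of close() and 
GeneratorExit in action:

```python
cleaned = []

def worker():
    try:
        yield 1
        yield 2
    finally:
        cleaned.append(True)   # cleanup runs when GeneratorExit arrives

g = worker()
next(g)       # suspend at the first yield
g.close()     # throws GeneratorExit at the yield; the finally clause
              # runs, and close() itself raises nothing
g.close()     # idempotent: already closed, a no-op
assert cleaned == [True]
```
)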

That's it! With this, we can write the decorator from Nick's PEP 3XX
and the generator examples in PEP 343 can be rewritten to have a
try/finally clause around the yield statement.

Oh, somewhere it should be stated that yield without an expression is
equivalent to yield None. PEP 342 ("continue EXPR") already implies
that, so we don't have to write a separate PEP for it. I also propose
to go with the alternative in PEP 342 of using next() rather than
__next__() -- generators will have methods next(), throw(), and
close().

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Wed May 18 18:55:49 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Wed, 18 May 2005 12:55:49 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <ca471dc2050518093929da936c@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050518124801.01eea0f0@mail.telecommunity.com>

At 09:39 AM 5/18/2005 -0700, Guido van Rossum wrote:
>- g.throw(type, value, traceback) causes the specified exception to be
>thrown at the place where the generator g is currently suspended. If
>the generator catches the exception and yields another value, that is
>the return value of g.throw(). If it doesn't catch the exception, the
>throw() appears to raise the same exception passed it (it "falls
>through"). If the generator raises another exception (this includes
>the StopIteration produced when it returns) that exception is raised
>by the throw. In summary, throw() behaves like next() except it raises
>an exception at the place of the yield. If the generator is already in
>the closed state, throw() just raises the exception it was passed
>without going through the generator.
>
>- There's a new exception, GeneratorExit, which can be thrown to cause
>a generator to clean up. A generator should not yield a value in
>response to this exception.
>
>- g.close() throws a GeneratorExit exception in the generator, and
>catches it (so g.close() itself does not raise an exception).
>g.close() is idempotent -- if the generator is already closed, it is a
>no-op. If the generator, against the rules, yields another value, it
>is nevertheless marked closed.
>
>- When a generator is GC'ed, its close() method is called (which is a
>no-op if it is already closed).
>
>That's it! With this, we can write the decorator from Nick's PEP 3XX
>and the generator examples in PEP 343 can be rewritten to have a
>try/finally clause around the yield statement.
>
>Oh, somewhere it should be stated that yield without an expression is
>equivalent to yield None. PEP 342 ("continue EXPR") already implies
>that, so we don't have to write a separate PEP for it. I also propose
>to go with the alternative in PEP 342 of using next() rather than
>__next__() -- generators will have methods next(), throw(), and
>close().

And there was much rejoicing in the land of the co-routiney people.  :)  +1000.

Should this maybe just be added to PEP 342?  To me, PEP 342 has always 
seemed incomplete without ways to throw() and close(), but that could 
easily be just me.  In any case I'd expect the implementation of 
'next(arg)' to have some overlap with the implementation of 'throw()'.

Also, if the generator yields a value upon close(), shouldn't that throw a 
runtime error?  Otherwise, you have no way to know the generator's 
exception handling is broken.


From Michaels at rd.bbc.co.uk  Wed May 18 18:36:59 2005
From: Michaels at rd.bbc.co.uk (Michael Sparks)
Date: Wed, 18 May 2005 17:36:59 +0100
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <ca471dc2050518093929da936c@mail.gmail.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
Message-ID: <200505181736.59532.Michaels@rd.bbc.co.uk>

On Wednesday 18 May 2005 17:39, Guido van Rossum wrote:
> I believe that in the discussion about PEP 343 vs. Nick's PEP 3XX
> (http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html, still
> awaiting PEP moderator approval I believe?) the main difference is
> that Nick proposes a way to inject an exception into a generator; and
> I've said a few times that I like that idea.
>
> I'd like to propose to make that a separate PEP, which can combine
> elements of PEP 288 and PEP 325. Summary:
>
> - g.throw(type, value, traceback) causes the specified exception to be
> thrown at the place where the generator g is currently suspended. 
..
> - There's a new exception, GeneratorExit, which can be thrown to cause
> a generator to clean up. A generator should not yield a value in
> response to this exception.
>
> - g.close() throws a GeneratorExit exception in the generator, and
> catches it (so g.close() itself does not raise an exception).
> g.close() is idempotent -- if the generator is already closed, it is a
> no-op. If the generator, against the rules, yields another value, it
> is nevertheless marked closed.

+1

We're currently using python generators to handle concurrency in a single 
thread, and this allows a simple scheduler to have a clean way of sending 
generators a generator shutdown message. (Currently we have to do it
another way that is specific to the style of wrapped generator)

If you need a volunteer to code this - should it go through, I'm willing to 
have a go at this. (I can't do this for 3 weeks or so, though, due to a 
crunch at work, and I might be in over my head in offering this.)


Michael.
-- 
Michael Sparks, Senior R&D Engineer, Digital Media Group
Michael.Sparks at rd.bbc.co.uk, http://kamaelia.sourceforge.net/
British Broadcasting Corporation, Research and Development
Kingswood Warren, Surrey KT20 6NP

This e-mail may contain personal views which are not the views of the BBC.

From gvanrossum at gmail.com  Wed May 18 18:55:12 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 18 May 2005 09:55:12 -0700
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <5.1.1.6.0.20050518124801.01eea0f0@mail.telecommunity.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050518124801.01eea0f0@mail.telecommunity.com>
Message-ID: <ca471dc2050518095515aadde9@mail.gmail.com>

[Phillip J. Eby]
> And there was much rejoicing in the land of the co-routiney people.  :)  +1000.
> 
> Should this maybe just be added to PEP 342?  To me, PEP 342 has always
> seemed incomplete without ways to throw() and close(), but that could
> easily be just me.  In any case I'd expect the implementation of
> 'next(arg)' to have some overlap with the implementation of 'throw()'.

Maybe, but on the other hand this idea can be done independently from
PEP 342. After the "monster-PEP" 340, I'd rather break proposals up in
small parts.

> Also, if the generator yields a value upon close(), shouldn't that throw a
> runtime error?  Otherwise, you have no way to know the generator's
> exception handling is broken.

Maybe. But then what should happen when this happens to close()
invoked by the GC? I guess the same as when a __del__() method raises
an exception -- print a traceback and go on. OK, works for me.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From mcherm at mcherm.com  Wed May 18 19:02:25 2005
From: mcherm at mcherm.com (Michael Chermside)
Date: Wed, 18 May 2005 10:02:25 -0700
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP
	343	-Abstract Block Redux)
Message-ID: <20050518100225.uh0ktbqt4dfs4kgk@login.werra.lunarpages.com>

[I apologize in advance if this sounds a bit disjointed... I started
to argue one thing, but by the end had convinced myself of the
opposite, and I re-wrote the email to match my final conclusion.]

Guido writes:
> About deleting VAR I have mixed feelings. [...]
> I think that, given that we let the for-loop variable survive, we
> should treat the with-statement variable the same way.

We said the same thing about the variable in list comprehensions
and it's now obvious that it should NEVER have been allowed to escape
its scope. But the key difference is that list comprehensions are
EXPRESSIONS, and for and 'with' are STATEMENTS. Expressions shouldn't
modify the local environment, statements often do.

Of course, that argument _permits_ not deleting VAR, but doesn't
recommend in favor of it.

My first thought for ideal behavior was that if VAR was previously
defined (eg: a global, an existing attribute of some object, etc),
then it should not be 'del''ed afterward. But if VAR was newly created
by the 'with' statement then we WOULD del it to keep the namespace
"neat". Trouble is, that's FAR too complex, and depends on a
distinction Python has not used before (although it's nearly the
same as the property that controls the meaning of globally declared
variables).

My next thought was to just allow 'with' statements to introduce
their own "scope"... the meaning that the VAR variable takes on within
a 'with' statement is not propagated outside the scope of the statement.
But imagine trying to implement this in CPython... don't forget details
like supporting locals(). If it's too hard to do, then it's probably
not the right solution.

So then I thought "Well, what's the harm in letting the variable
survive the 'with' statement?" I'm a big fan of keeping namespaces
"clean", but it's just not important enough to incur other penalties.
So in this case, I (reluctantly, after giving myself quite a talking-to)
favor having the 'with' statement with VAR create said variable in the
appropriate scope as a side-effect, much like 'for'.

-- Michael Chermside


From mwh at python.net  Mon May 16 11:38:37 2005
From: mwh at python.net (Michael Hudson)
Date: Mon, 16 May 2005 10:38:37 +0100
Subject: [Python-Dev] PEP 343 - Abstract Block Redux
In-Reply-To: <42883C05.705@canterbury.ac.nz> (Greg Ewing's message of "Mon,
	16 May 2005 18:21:57 +1200")
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<42883C05.705@canterbury.ac.nz>
Message-ID: <2moebb4ijm.fsf@starship.python.net>

Greg Ewing <greg.ewing at canterbury.ac.nz> writes:

> Guido van Rossum wrote:
>
>> PEP 340 is still my favorite, but it seems there's too much opposition
>> to it,
>
> I'm not opposed to PEP 340 in principle, but the
> ramifications seemed to be getting extraordinarily
> complicated, and it seems to be hamstrung by
> various backwards-compatibility constraints.
> E.g. it seems we can't make for-loops work the way
> they should in the face of generator finalisation
> or we break old code.

I think I zoned this part of the discussion out, but I've written code
like this:

lineiter = iter(aFile)

for line in lineiter:
    if sectionmarker in line:
        break
    parseSection1Line(line)

for line in lineiter:
    if sectionmarker in line:
        break
    parseSection2Line(line)

(though, not *quite* that regular...)

This is, to me, neat and clear.  I don't find the idea that iterators
are tied to exactly 1 for loop an improvement (even though they
usually will be).

Cheers,
mwh

-- 
  <thirmite> what's a web widget??
  <glyph> thirmite: internet on a stick, on fire
  <Acapnotic> with web sauce!
                                                -- from Twisted.Quotes

From pje at telecommunity.com  Wed May 18 19:20:43 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Wed, 18 May 2005 13:20:43 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <ca471dc2050518095515aadde9@mail.gmail.com>
References: <5.1.1.6.0.20050518124801.01eea0f0@mail.telecommunity.com>
	<ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050518124801.01eea0f0@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050518130957.032d26d8@mail.telecommunity.com>

At 09:55 AM 5/18/2005 -0700, Guido van Rossum wrote:
>[Phillip J. Eby]
> > And there was much rejoicing in the land of the co-routiney 
> people.  :)  +1000.
> >
> > Should this maybe just be added to PEP 342?  To me, PEP 342 has always
> > seemed incomplete without ways to throw() and close(), but that could
> > easily be just me.  In any case I'd expect the implementation of
> > 'next(arg)' to have some overlap with the implementation of 'throw()'.
>
>Maybe, but on the other hand this idea can be done independently from
>PEP 342. After the "monster-PEP" 340, I'd rather break proposals up in
>small parts.

Okay.  Maybe we should just update PEP 325, then?  It has much of the stuff 
that we'd want in the new PEP, such as the rationale.  Your new proposal, 
AFAICT, is just a simple extension of the PEP 325 protocol (i.e., adding 
'throw()'), along with some decisions to resolve its open issues.  Even the 
addition of 'throw()' seems tacitly approved by this bit at the end:

"""Were PEP 288 implemented, Exceptions Semantics for close could be 
layered on top of it"""

So at this point it seems your proposal is just nailing down specifics for 
the open parts of PEP 325.


From gvanrossum at gmail.com  Wed May 18 19:23:13 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 18 May 2005 10:23:13 -0700
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <5.1.1.6.0.20050518130957.032d26d8@mail.telecommunity.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050518124801.01eea0f0@mail.telecommunity.com>
	<ca471dc2050518095515aadde9@mail.gmail.com>
	<5.1.1.6.0.20050518130957.032d26d8@mail.telecommunity.com>
Message-ID: <ca471dc20505181023723d0fc4@mail.gmail.com>

[Phillip J. Eby]
> Okay.  Maybe we should just update PEP 325, then?  It has much of the stuff
> that we'd want in the new PEP, such as the rationale.  Your new proposal,
> AFAICT, is just a simple extension of the PEP 325 protocol (i.e., adding
> 'throw()'), along with some decisions to resolve its open issues.  Even the
> addition of 'throw()' seems tacitly approved by this bit at the end:
> 
> """Were PEP 288 implemented, Exceptions Semantics for close could be
> layered on top of it"""
> 
> So at this point it seems your proposal is just nailing down specifics for
> the open parts of PEP 325.

Or PEP 288? That has throw() (albeit with a different signature). I
could do without the attributes though (PEP 342 provides a much better
solution IMO).

If either of those PEP authors feels like updating their PEP, they
have my blessings! I probably won't get to writing my own for a few
more days.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From python at rcn.com  Wed May 18 19:24:29 2005
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 18 May 2005 13:24:29 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generatorexceptions and cleanup
In-Reply-To: <ca471dc2050518093929da936c@mail.gmail.com>
Message-ID: <001d01c55bce$7341a640$2db0958d@oemcomputer>

> I'd like to propose to make that a separate PEP, which can combine
> elements of PEP 288 and PEP 325. 

+1 
Overall, the combined PEP proposal looks pretty good.



> - g.throw(type, value, traceback) causes the specified exception to be
> thrown at the place where the generator g is currently suspended.

Are the value and traceback arguments optional as they are with the
current raise statement?  If they are optional, what would the default
be?  I think the preferred choice is to have the call to the throw
method be the anchor point.  That makes sense in a traceback so you can
see who threw the exception.

The alternative is to have the generator resumption point be the anchor.
That is closer to the notion that throw(ex) is equivalent to a "raise
ex" following the last yield.   This probably isn't the way to go but
the PEP should address it explicitly.



> If the generator raises another exception (this includes
> the StopIteration produced when it returns) that exception is raised
> by the throw. In summary, throw() behaves like next() except it raises
> an exception at the place of the yield.

The parallel to next() makes this easy to understand, learn, and
implement.  However, there are some disadvantages to passing through a
StopIteration.  It means that throw() calls usually need to be wrapped
in a try/except or that a generator's exception handler would terminate
with a "yield None" where a "return" would be more natural.  As an
example, it is a bit painful to simulate the effects of g.close() using
g.throw(GeneratorExit).
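
That simulation looks roughly like this (a sketch; the helper name is 
hypothetical, but it mirrors what the proposed close() must do internally):

```python
def simulated_close(g):
    """Hypothetical hand-rolled equivalent of the proposed g.close()."""
    try:
        g.throw(GeneratorExit)
    except (GeneratorExit, StopIteration):
        pass   # the generator exited cleanly, as the rules require
    else:
        # The generator yielded in response to GeneratorExit.
        raise RuntimeError("generator ignored GeneratorExit")

def worker():
    try:
        yield 1
    finally:
        pass   # cleanup would go here

g = worker()
next(g)
simulated_close(g)   # no exception escapes
```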




> That's it! With this, we can write the decorator from Nick's PEP 3XX
> and the generator examples in PEP 343 can be rewritten to have a
> try/finally clause around the yield statement.

Yea!  This is very nice.


 
> Oh, somewhere it should be stated that yield without an expression is
> equivalent to yield None. PEP 342 ("continue EXPR") already implies
> that, so we don't have to write a separate PEP for it.

Right.



> I also propose
> to go with the alternative in PEP 342 of using next() rather than
> __next__() -- generators will have methods next(), throw(), and
> close().

+0  The switch from __next__() to next() is attractive but not essential
to the proposal.  Besides a small cost to backwards compatibility, it
introduces yet another new/old style distinction where we have to keep
both forms in perpetuity.



Raymond

From python at rcn.com  Wed May 18 19:27:19 2005
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 18 May 2005 13:27:19 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP
	325:generator exceptions and cleanup
In-Reply-To: <ca471dc2050518095515aadde9@mail.gmail.com>
Message-ID: <001e01c55bce$d86addc0$2db0958d@oemcomputer>

> > Should this maybe just be added to PEP 342?  To me, PEP 342 has
> > always seemed incomplete without ways to throw() and close(), but
> > that could easily be just me.  In any case I'd expect the
> > implementation of 'next(arg)' to have some overlap with the
> > implementation of 'throw()'.
> 
> Maybe, but on the other hand this idea can be done independently from
> PEP 342. After the "monster-PEP" 340, I'd rather break proposals up in
> small parts.

+1

I want this as a separate PEP.  It is a straightforward solution to
long-standing issues.  I would rather not have it contaminated with
distracting issues and co-routine dreams.


Raymond

From python at rcn.com  Wed May 18 19:28:20 2005
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 18 May 2005 13:28:20 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <5.1.1.6.0.20050518130957.032d26d8@mail.telecommunity.com>
Message-ID: <001f01c55bce$fc976ce0$2db0958d@oemcomputer>

> Okay.  Maybe we should just update PEP 325, then?

-1.

Keep this separate.


Raymond

From gvanrossum at gmail.com  Wed May 18 19:46:40 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Wed, 18 May 2005 10:46:40 -0700
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generatorexceptions and cleanup
In-Reply-To: <001d01c55bce$7341a640$2db0958d@oemcomputer>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<001d01c55bce$7341a640$2db0958d@oemcomputer>
Message-ID: <ca471dc205051810461c81ec68@mail.gmail.com>

[Raymond Hettinger]
> Are the value and traceback arguments optional as they are with the
> current raise statement?  If they are optional, what would the default
> be?  I think the preferred choice is to have the call to the throw
> method be the anchor point.  That makes sense in a traceback so you can
> see who threw the exception.

As far as throw() is concerned, the defaults are None. The raise statement
does something sane when the second and/or third arg are None (the
first can't be though).

> The alternative is to have the generator resumption point be the anchor.
> That is closer to the notion that throw(ex) is equivalent to a "raise
> ex" following the last yield.   This probably isn't the way to go but
> the PEP should address it explicitly.

It's actually kind of tricky since the exception will come *back* to
the throw point anyway. I think the traceback ought to start at the
resumption point by default.

> > If the generator raises another exception (this includes
> > the StopIteration produced when it returns) that exception is raised
> > by the throw. In summary, throw() behaves like next() except it raises
> > an exception at the place of the yield.
> 
> The parallel to next() makes this easy to understand, learn, and
> implement.  However, there are some disadvantages to passing through a
> StopIteration.  It means that throw() calls usually need to be wrapped
> in a try/except or that a generator's exception handler would terminate
> with a "yield None" where a "return" would be more natural.  As a
> example, it is a bit painful to simulate the effects of g.close() using
> g.throw(GeneratorExit).

Doesn't bother me; the main use case is in the do_template (or
with_template) decorator. Since it must support both raising an
exception and returning a value, we're pretty much forced to catch the
exception (unless we just want to pass it through, which is actually a
reasonable use case).

> > I also propose
> > to go with the alternative in PEP 342 of using next() rather than
> > __next__() -- generators will have methods next(), throw(), and
> > close().
> 
> +0  The switch from __next__() to next() is attractive but not essential
> to the proposal.  Besides a small cost to backwards compatability, it
> introduces yet another new/old style distinction where we have to keep
> both forms in perpetuity.

Right. PBP. :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From python at rcn.com  Wed May 18 19:59:48 2005
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 18 May 2005 13:59:48 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generatorexceptions and cleanup
In-Reply-To: <ca471dc205051810461c81ec68@mail.gmail.com>
Message-ID: <002001c55bd3$61e5bbc0$2db0958d@oemcomputer>

> [Raymond Hettinger]
> > Are the value and traceback arguments optional as they are with the
> > current raise statement?  If they are optional, what would the
> > default be?  I think the preferred choice is to have the call to
> > the throw method be the anchor point.  That makes sense in a
> > traceback so you can see who threw the exception.
> 
> As far as throw() is concerned, the defaults are None. The raise
> statement does something sane when the second and/or third arg are
> None (the first can't be though).
> 
> > The alternative is to have the generator resumption point be the
> > anchor.  That is closer to the notion that throw(ex) is equivalent
> > to a "raise ex" following the last yield.  This probably isn't the
> > way to go but the PEP should address it explicitly.
> 
> It's actually kind of tricky since the exception will come *back* to
> the throw point anyway. I think the traceback ought to start at the
> resumption point by default.
> 
> > > If the generator raises another exception (this includes the
> > > StopIteration produced when it returns) that exception is raised
> > > by the throw. In summary, throw() behaves like next() except it
> > > raises an exception at the place of the yield.
> >
> > The parallel to next() makes this easy to understand, learn, and
> > implement.  However, there are some disadvantages to passing
> > through a StopIteration.  It means that throw() calls usually need
> > to be wrapped in a try/except or that a generator's exception
> > handler would terminate with a "yield None" where a "return" would
> > be more natural.  As an example, it is a bit painful to simulate
> > the effects of g.close() using g.throw(GeneratorExit).
> 
> Doesn't bother me; the main use case is in the do_template (or
> with_template) decorator. Since it must support both raising an
> exception and returning a value, we're pretty much forced to catch
> the exception (unless we just want to pass it through, which is
> actually a reasonable use case).
> 
> > > I also propose to go with the alternative in PEP 342 of using
> > > next() rather than __next__() -- generators will have methods
> > > next(), throw(), and close().
> >
> > +0  The switch from __next__() to next() is attractive but not
> > essential to the proposal.  Besides a small cost to backwards
> > compatibility, it introduces yet another new/old style distinction
> > where we have to keep both forms in perpetuity.
> 
> Right. PBP. :-)


FWIW, I'm in agreement with everything.
I hope this one gets accepted.
Please do put it in a separate PEP.


Raymond

From pje at telecommunity.com  Wed May 18 20:32:58 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Wed, 18 May 2005 14:32:58 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generatorexceptions and cleanup
In-Reply-To: <001d01c55bce$7341a640$2db0958d@oemcomputer>
References: <ca471dc2050518093929da936c@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050518140611.01eeb260@mail.telecommunity.com>

At 01:24 PM 5/18/2005 -0400, Raymond Hettinger wrote:
> > - g.throw(type, value, traceback) causes the specified exception to be
> > thrown at the place where the generator g is currently suspended.
>
>Are the value and traceback arguments optional as they are with the
>current raise statement?  If they are optional, what would the default
>be?  I think the preferred choice is to have the call to the throw
>method be the anchor point.  That makes sense in a traceback so you can
>see who threw the exception.
>
>The alternative is to have the generator resumption point be the anchor.
>That is closer to the notion that throw(ex) is equivalent to a "raise
>ex" following the last yield.   This probably isn't the way to go but
>the PEP should address it explicitly.

My use case for throw() calls for the latter option; i.e., the exception is 
raised by the yield expression at the resumption point.  Keep in mind that 
if the exception passes out of the generator, the throw() call will show in 
the traceback anyway.  It's unlikely the generator itself will inspect the 
traceback and need to see the throw() call as if it were nested.
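
Modern Python adopted exactly this behaviour: the exception from throw()
surfaces at the suspended yield, where the generator may catch it. A minimal
sketch (modern syntax, not code from this thread):

```python
def gen():
    try:
        yield "first"
    except KeyError:
        # The exception injected by g.throw() is raised here,
        # at the yield where the generator was suspended.
        yield "caught"

g = gen()
print(next(g))            # first
print(g.throw(KeyError))  # caught: throw() returns the next yielded value
```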


> > If the generator raises another exception (this includes
> > the StopIteration produced when it returns) that exception is raised
> > by the throw. In summary, throw() behaves like next() except it raises
> > an exception at the place of the yield.
>
>The parallel to next() makes this easy to understand, learn, and
>implement.  However, there are some disadvantages to passing through a
>StopIteration.  It means that throw() calls usually need to be wrapped
>in a try/except or that a generator's exception handler would terminate
>with a "yield None" where a "return" would be more natural.  As an
>example, it is a bit painful to simulate the effects of g.close() using
>g.throw(GeneratorExit).

I don't see this as a big problem, personally, but that's because all of my 
use cases for throw() will be using only one piece of code that calls 
throw(), and that code will be overall simplified by the availability of 
throw().

It's also easy to write a wrapper for control-flow "signals" of the kind 
you used in PEP 288:

     def send(gen, *exc):
         try:
             return gen.throw(*exc)
         except StopIteration:
             pass
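
As a rough illustration in modern Python (renaming the wrapper to avoid a
clash with the send() method generators later grew), the wrapper absorbs the
StopIteration produced when the generator returns in response to the thrown
exception:

```python
def send_exc(gen, *exc):
    # Phillip's wrapper: deliver an exception, swallow StopIteration.
    try:
        return gen.throw(*exc)
    except StopIteration:
        pass

def worker():
    try:
        while True:
            yield "working"
    except GeneratorExit:
        return  # returning raises StopIteration out of throw()

g = worker()
next(g)
print(send_exc(g, GeneratorExit))  # None: the StopIteration was absorbed
```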


From pje at telecommunity.com  Wed May 18 20:42:11 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Wed, 18 May 2005 14:42:11 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <001f01c55bce$fc976ce0$2db0958d@oemcomputer>
References: <5.1.1.6.0.20050518130957.032d26d8@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050518143502.01efc198@mail.telecommunity.com>

At 01:28 PM 5/18/2005 -0400, Raymond Hettinger wrote:
> > Okay.  Maybe we should just update PEP 325, then?
>
>-1.
>
>Keep this separate.

Have you read PEP 325 lately?  Mostly the change would consist of deleting 
rejected options or moving them to a rejected options section.  The only 
other change would be adding a short section stating how throw() would work 
and that it's being made public to support the future use of generators as 
flow-control templates.

A new PEP would have to copy, reinvent, or reference large chunks of PEP 
325, resulting in either redundancy or excess complexity.

Or are you suggesting a new PEP for throw(), containing *only* an 
explanation of its semantics, and then modifying PEP 325 to indicate that 
it will be implemented using the new PEP's 'throw()'?  That's about the 
only scenario that makes sense to me for adding a new PEP, because PEP 325 
is already pretty darn complete with respect to close() and GC.


From python at rcn.com  Wed May 18 22:05:26 2005
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 18 May 2005 16:05:26 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP
	325:generator exceptions and cleanup
In-Reply-To: <ca471dc20505181023723d0fc4@mail.gmail.com>
Message-ID: <003201c55be4$ef148e20$2db0958d@oemcomputer>

> > So at this point it seems your proposal is just nailing down specifics
> > for the open parts of PEP 325.
> 
> Or PEP 288? That has throw() (albeit with a different signature). I
> could do without the attributes though (PEP 342 provides a much better
> solution IMO).
> 
> If either of those PEP authors feels like updating their PEP, they
> have my blessings! I probably won't get to writing my own for a few
> more days.

Okay, I volunteer to recast PEP 288 to encompass your combined proposal.

Will tackle it in the morning.


Raymond

From mcherm at mcherm.com  Thu May 19 01:39:18 2005
From: mcherm at mcherm.com (Michael Chermside)
Date: Wed, 18 May 2005 16:39:18 -0700
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343
	-	Abstract Block Redux)
Message-ID: <20050518163918.5sr1j65p2txws4gk@login.werra.lunarpages.com>

Guido writes:

  [a rather silly objection to Phillip's proposal that 'with x:' is
   a no-op when x lacks __enter__ and __exit__]

> I know this is not a very strong argument, but my gut tells me this
> generalization of the with-statement is wrong, so I'll stick to it
> regardless of the strength of the argument. The real reason will come
> to me.

Perhaps the real reason is that it allows errors to pass silently.

If I write

    with foo:
       BLOCK

where I should have written

    with locked(foo):
       BLOCK

...it silently "succeeds" by doing nothing. I CLEARLY intended to
do the appropriate cleanup (or locking, or whatever), but it doesn't
happen.
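
Today's Python agrees with this instinct: a with statement on an object that
lacks the context manager protocol fails loudly rather than silently doing
nothing. A quick check (modern syntax):

```python
class Foo:
    pass

try:
    with Foo():  # no __enter__/__exit__
        pass
    outcome = "silently succeeded"
except (AttributeError, TypeError):
    # AttributeError on older 3.x releases, TypeError on newer ones
    outcome = "failed loudly"

print(outcome)  # failed loudly
```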

-- Michael Chermside


From goodger at python.org  Thu May 19 01:59:48 2005
From: goodger at python.org (David Goodger)
Date: Wed, 18 May 2005 19:59:48 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <ca471dc2050518093929da936c@mail.gmail.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
Message-ID: <428BD6F4.9060108@python.org>

[Guido van Rossum]
> I believe that in the discussion about PEP 343 vs. Nick's PEP 3XX
> (http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html, still
> awaiting PEP moderator approval I believe?) ...

Nick hasn't submitted it for a PEP number yet.

--
David Goodger <http://python.net/~goodger>

From nidoizo at yahoo.com  Tue May 17 07:58:12 2005
From: nidoizo at yahoo.com (Nicolas Fleury)
Date: Mon, 16 May 2005 22:58:12 -0700 (PDT)
Subject: [Python-Dev] Adding content to exception messages
Message-ID: <20050517055812.42184.qmail@web50908.mail.yahoo.com>

Sorry if this message is not a direct reply to Ka-Ping
Yee's message on PEP 344; I'm on vacation in China, and
there's something I must say that could make sense for
PEP 344.

I do a lot of exception re-raising at work; I use that
technique to add content to exception messages while
keeping the original stack.  I even created a reraise
function that I use that way:

try:
    parser.parseFile(file)
except Exception, exception:
    reraise(exception,
            "Error at line %s in file %s" % (x,y))

(x,y) are details, but you get the idea.

This is very useful in many situations.  In the example,
it works when an error happens while parsing a file that
includes other files (like XML files with
<xs:include>).  That way you get the exact path of
inclusion leading to the error.  It is also useful
when an error happens in very generic code and the
traceback is not enough to know which element was
causing the error.

What I propose is that all exception objects have a
standard way to add additional information.
Something like:

try: 
    foo()
except Exception, exception:
    exception.add_info("some info")
    raise exception from (whatever the proposition is)

You might ask, "why not just reraise a new
exception?".  It is more useful to reraise the same
exception type, making it possible to use selective
except clauses and to avoid problems with code that
uses them (like hasattr).  I think the simplest approach
would be to leave the __str__ representation alone and
prepend the additional info to a list inside the
exception object, adding a function to retrieve that
list.  I don't mind how it is done, as long as the need
is fulfilled.  I won't be able to read my messages often
for the next 10 days, but I hope others will see the
point I'm trying to make. ;)
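
A minimal sketch of such a reraise helper in modern Python syntax (the
reraise name and behaviour are Nicolas's idea, not a stdlib function): it
prepends context to the exception's arguments and re-raises the very same
exception with its original traceback, so selective except clauses still
match:

```python
import sys

def reraise(exc, message):
    # Hypothetical helper: prepend extra context, keep the original
    # traceback, and re-raise the same exception object.
    exc.args = (message,) + exc.args
    raise exc.with_traceback(sys.exc_info()[2])

try:
    try:
        int("not a number")
    except ValueError as e:
        reraise(e, "Error at line 3 in file example.xml")
except ValueError as e:
    print(e.args[0])  # Error at line 3 in file example.xml
```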

Regards,
Nicolas



		

From greg.ewing at canterbury.ac.nz  Thu May 19 05:31:23 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 19 May 2005 15:31:23 +1200
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generatorexceptions and cleanup
In-Reply-To: <5.1.1.6.0.20050518140611.01eeb260@mail.telecommunity.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050518140611.01eeb260@mail.telecommunity.com>
Message-ID: <428C088B.1030507@canterbury.ac.nz>

Phillip J. Eby wrote:
> My use case for throw() calls for the latter option; i.e., the exception is 
> raised by the yield expression at the resumption point.  Keep in mind that 
> if the exception passes out of the generator, the throw() call will show in 
> the traceback anyway.  It's unlikely the generator itself will inspect the 
> traceback and need to see the throw() call as if it were nested.

There mightn't be much choice anyway. If the frame making
the call to throw() were to be made the starting point for
the traceback, and the exception propagated back to the
throw, something would try to put the same frame in the
traceback twice, which can't work since it's a linked
list.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Thu May 19 06:44:49 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 19 May 2005 16:44:49 +1200
Subject: [Python-Dev] PEP 343 - New kind of yield statement?
In-Reply-To: <2moebb4ijm.fsf@starship.python.net>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<42883C05.705@canterbury.ac.nz> <2moebb4ijm.fsf@starship.python.net>
Message-ID: <428C19C1.9000609@canterbury.ac.nz>

Michael Hudson wrote:

> This is, to me, neat and clear.  I don't find the idea that iterators
> are tied to exactly 1 for loop an improvement (even though they
> usually will be).

To fix this in a fully backward-compatible way, we
need some way of distinguishing generators that
expect to be finalized.

Suppose we leave the 'yield' statement alone, and
introduce a new statement 'suspend', which alone
has the new capabilities of

(1) allowing injection of exceptions
(2) ability to return a value
(3) permissibility in a try-finally

Doing throw() on a generator that is stopped at
a yield statement would simply raise the exception
without changing the state of the generator. So
the for-loop could be made to finalize by default,
and existing generators would be unaffected.

A with-statement generator would then look like

   @with_template
   def foo():
     initialize()
     try:
       suspend
     finally:
       finalize()

which I think looks quite nice, because 'suspend'
seems more suggestive of what is happening when
you're not yielding a value. The same thing applies
to coroutine-type applications.

For partial iteration of new-style generators,
there could be a new statement

   for var from expr:
     ...

or maybe just a wrapper function

   for var in partial(expr):
     ...
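
For comparison, the decorator-based approach that eventually shipped
(contextlib.contextmanager, built on PEP 342's enhanced generators)
expresses Greg's foo() with a plain yield rather than a new 'suspend'
keyword. A sketch in modern Python:

```python
from contextlib import contextmanager

events = []

@contextmanager
def foo():
    events.append("initialize")
    try:
        yield  # plays the role Greg sketches for 'suspend'
    finally:
        events.append("finalize")

with foo():
    events.append("body")

print(events)  # ['initialize', 'body', 'finalize']
```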

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Thu May 19 06:46:49 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 19 May 2005 16:46:49 +1200
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <ca471dc205051808487bc3e875@mail.gmail.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<20050516214618.GA23741@panix.com>
	<Pine.LNX.4.58.0505161956580.14555@server1.LFW.org>
	<20050517021147.GL20441@performancedrivers.com>
	<ca471dc2050516211225b24436@mail.gmail.com>
	<428A9269.6080805@canterbury.ac.nz>
	<ca471dc205051808487bc3e875@mail.gmail.com>
Message-ID: <428C1A39.3010401@canterbury.ac.nz>

Guido van Rossum wrote:
> Here's another rule-of-thumb: when the VM and the user *share* the
> attribute space of an object, the VM uses system attributes; the VM
> uses plain attributes for objects that it owns completely (like code
> objects, frames and so on, which rarely figure user code except for
> the explicit purpose of introspection). So I think the PEP should
> continue to use __traceback__ etc.

I was just thinking the same thing myself!

(Does Guido have a telepathy machine now, as well
as a time machine?)

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Thu May 19 08:09:33 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 19 May 2005 18:09:33 +1200
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <ca471dc2050518093929da936c@mail.gmail.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
Message-ID: <428C2D9D.2080803@canterbury.ac.nz>

Guido van Rossum wrote:

> - When a generator is GC'ed, its close() method is called (which is a
> no-op if it is already closed).

Does this mean that all generators will be ineligible
for cyclic garbage collection (since they implicitly
have something equivalent to a __del__ method)?

Other than that, all this looks good.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From ncoghlan at gmail.com  Thu May 19 10:57:38 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 19 May 2005 18:57:38 +1000
Subject: [Python-Dev] Adding content to exception messages
In-Reply-To: <20050517055812.42184.qmail@web50908.mail.yahoo.com>
References: <20050517055812.42184.qmail@web50908.mail.yahoo.com>
Message-ID: <428C5502.80503@gmail.com>

Nicolas Fleury wrote:
> I do a lot of exception re-raising at work; I use that
> technique to add content to exception messages while
> keeping the original stack.  I even created a reraise
> function that I use that way:
> 
> try:
>     parser.parseFile(file)
> except Exception, exception:
>     reraise(exception, 
>             "Error at line %s in file %s" % (x,y))
> 
> (x,y) are details, but you get the idea.
> 

With PEP 344, this could simply be:

   try:
       parser.parseFile(file)
   except Exception, exception:
       raise type(exception)("Error at line %s in file %s" % (x,y))

Introspectively,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Thu May 19 11:50:50 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 19 May 2005 19:50:50 +1000
Subject: [Python-Dev] Simpler finalization semantics (was Re: PEP 343
 -	Abstract Block Redux)
In-Reply-To: <20050518163918.5sr1j65p2txws4gk@login.werra.lunarpages.com>
References: <20050518163918.5sr1j65p2txws4gk@login.werra.lunarpages.com>
Message-ID: <428C617A.7000609@gmail.com>

Michael Chermside wrote:
> If I write
> 
>     with foo:
>        BLOCK
> 
> where I should have written
> 
>     with locked(foo):
>        BLOCK
> 
> ...it silently "succeeds" by doing nothing. I CLEARLY intended to
> do the appropriate cleanup (or locking, or whatever), but it doesn't
> happen.

Ah, thanks. Like Guido, I had something in the back of my head telling me it 
didn't like the idea, but I couldn't figure out the reason. I think you just 
nailed it.

Plus, there is a nice alternative which is to provide a 'clean it up if it needs 
it' resource in the standard library:

   class resource(object):
       def __init__(self, obj):
           self.obj = obj
           self.enter = getattr(obj, "__enter__", None)
           self.exit = getattr(obj, "__exit__", None)

       def __enter__(self):
           if self.enter is not None:
               self.enter()
           # For consistency, always return the object
           return self.obj

       def __exit__(self, *exc_info):
           if self.exit is not None:
               self.exit(*exc_info)

Then 'I don't know if this needs cleaning up or not' can be written:

   with resource(foo):
       # If foo needs cleaning up, it will be.

A refinement would provide the option to specify the enter/exit methods directly:

   class resource(object):
       def __init__(self, obj, *other_args):
           self.obj = obj
           if other_args:
               if len(other_args) != 2:
                   raise TypeError("need 1 or 3 arguments")
               self.enter = other_args[0]
               self.exit = None
               self.exit_no_args = other_args[1]
           else:
               self.enter = getattr(obj, "__enter__", None)
               self.exit = getattr(obj, "__exit__", None)
               self.exit_no_args = None


       def __enter__(self):
           if self.enter is not None:
               self.enter()
           # For consistency, always return the object
           return self.obj

       def __exit__(self, *exc_info):
           if self.exit is not None:
              self.exit(*exc_info)
           elif self.exit_no_args is not None:
              self.exit()

That would let any object with a standard 'clean me up method' be easily used in 
a with statement:

   with resource(bar, None, bar.clear):
       # bar will be cleared when we're done
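
A runnable sketch of the simpler wrapper above (the resource class is Nick's
proposal, not a stdlib name), in modern class syntax:

```python
class resource:
    # Delegates to __enter__/__exit__ when present; a no-op otherwise.
    def __init__(self, obj):
        self.obj = obj
        self.enter = getattr(obj, "__enter__", None)
        self.exit = getattr(obj, "__exit__", None)

    def __enter__(self):
        if self.enter is not None:
            self.enter()
        # For consistency, always return the wrapped object
        return self.obj

    def __exit__(self, *exc_info):
        if self.exit is not None:
            self.exit(*exc_info)

class Managed:
    closed = False
    def __enter__(self):
        return self
    def __exit__(self, *exc_info):
        self.closed = True

m = Managed()
with resource(m):
    pass
print(m.closed)  # True: the wrapped object's cleanup ran

with resource(object()):  # no protocol methods: harmless no-op
    pass
```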

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Thu May 19 12:15:07 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 19 May 2005 20:15:07 +1000
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <ca471dc2050518093929da936c@mail.gmail.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
Message-ID: <428C672B.9070905@gmail.com>

Guido van Rossum wrote:
> I believe that in the discussion about PEP 343 vs. Nick's PEP 3XX
> (http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html, still
> awaiting PEP moderator approval I believe?)

It turns out my submission email took the scenic route, and I wasn't using a 
proper text editor so the formatting of the raw text version is messed up.

Given that I need to clean that formatting up, I figure I might as well update 
it in light of recent discussions before I resubmit it.

> the main difference is
> that Nick proposes a way to inject an exception into a generator; and
> I've said a few times that I like that idea.

If the current generator integration is dropped from PEP 343, I can rewrite my 
PEP to be *just* about the combination of PEP 288 and PEP 325 you describe, and 
use PEP 343 integration as a motivating use case.

The alternative would be to just submit and immediately withdraw it (since the 
user defined statement semantics now match PEP 343, and I basically like the 
generator interface you are proposing here, there wouldn't be much left of my 
PEP except for the big 'Rejected Options' section giving my understanding of the 
reasons we didn't take up various options).

<snip parts of the proposal I agree with completely>

> - g.close() throws a GeneratorExit exception in the generator, and
> catches it (so g.close() itself does not raise an exception).
> g.close() is idempotent -- if the generator is already closed, it is a
> no-op. If the generator, against the rules, yields another value, it
> is nevertheless marked closed.

Can't we make the method raise a RuntimeError when the generator breaks the 
rules? Or should we just print a warning instead (like the deletion code does if 
__del__ raises an exception)?
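
For the record, the semantics CPython eventually adopted match Nick's first
suggestion: close() raises RuntimeError if the generator ignores
GeneratorExit and yields again. Observable in modern Python:

```python
def stubborn():
    try:
        yield 1
    except GeneratorExit:
        yield 2  # against the rules: ignores the close request

g = stubborn()
next(g)
try:
    g.close()
    result = "closed quietly"
except RuntimeError:
    result = "RuntimeError: generator ignored GeneratorExit"

print(result)
```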

> - When a generator is GC'ed, its close() method is called (which is a
> no-op if it is already closed).

This is like giving it a __del__ method, though, since it can now resurrect 
arbitrary objects from a cycle it is involved in. I think you made the right 
call elsewhere, when you convinced me that generators shouldn't have a __del__ 
method any more than files should.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From python-dev at zesty.ca  Thu May 19 13:00:46 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Thu, 19 May 2005 06:00:46 -0500 (CDT)
Subject: [Python-Dev] Adding content to exception messages
In-Reply-To: <428C5502.80503@gmail.com>
References: <20050517055812.42184.qmail@web50908.mail.yahoo.com>
	<428C5502.80503@gmail.com>
Message-ID: <Pine.LNX.4.58.0505190557150.4932@server1.LFW.org>

On Thu, 19 May 2005, Nick Coghlan wrote:
> With PEP 344, this could simply be:
>
>    try:
>        parser.parseFile(file)
>    except Exception, exception:
>        raise type(exception)("Error at line %s in file %s" % (x,y))

Only if we also made all exceptions new-style classes.

That's just a minor nit, though.  The more important question to me is:

Do you care about the difference between a secondary exception that was
raised intentionally (as in your example) and a secondary exception due
to a problem in exception handling?

(For me, the answer is yes.)


-- ?!ng

From tdickenson at devmail.geminidataloggers.co.uk  Thu May 19 13:43:24 2005
From: tdickenson at devmail.geminidataloggers.co.uk (Toby Dickenson)
Date: Thu, 19 May 2005 12:43:24 +0100
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
Message-ID: <200505191243.24981.tdickenson@devmail.geminidataloggers.co.uk>

On Monday 16 May 2005 22:41, Ka-Ping Yee wrote:
>     http://www.python.org/peps/pep-0344.html

|     2.  Whenever an exception is raised, if the exception instance does
|         not already have a '__context__' attribute, the interpreter sets
|         it equal to the thread's exception context.

Should that be "if the exception instance does not already have a __context__ 
attribute or the value of that attribute is None...."

-- 
Toby Dickenson

From p.f.moore at gmail.com  Thu May 19 13:54:29 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 19 May 2005 12:54:29 +0100
Subject: [Python-Dev] PEP 343 - New kind of yield statement?
In-Reply-To: <428C19C1.9000609@canterbury.ac.nz>
References: <ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<42883C05.705@canterbury.ac.nz> <2moebb4ijm.fsf@starship.python.net>
	<428C19C1.9000609@canterbury.ac.nz>
Message-ID: <79990c6b0505190454703f67e3@mail.gmail.com>

On 5/19/05, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Michael Hudson wrote:
> 
> > This is, to me, neat and clear.  I don't find the idea that iterators
> > are tied to exactly 1 for loop an improvement (even though they
> > usually will be).
> 
> To fix this in a fully backward-compatible way, we
> need some way of distinguishing generators that
> expect to be finalized.

I don't see anything that needs to be "fixed" here. Sure, generators
that expect to be finalised will not be finalised simply by the fact
that a for loop exits, but that's fine - it's not part of the spec of
a for loop that it does finalise the generator. Adding that guarantee
to a for loop is a change in spec, not a fix.

Paul.

From pje at telecommunity.com  Thu May 19 14:42:38 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Thu, 19 May 2005 08:42:38 -0400
Subject: [Python-Dev] PEP 343 - New kind of yield statement?
In-Reply-To: <428C19C1.9000609@canterbury.ac.nz>
References: <2moebb4ijm.fsf@starship.python.net>
	<ca471dc205051317133cf8fd63@mail.gmail.com>
	<d64imp$fai$1@sea.gmane.org>
	<ca471dc205051410435473d2b2@mail.gmail.com>
	<42883C05.705@canterbury.ac.nz>
	<2moebb4ijm.fsf@starship.python.net>
Message-ID: <5.1.1.6.0.20050519084117.01f8d738@mail.telecommunity.com>

At 04:44 PM 5/19/2005 +1200, Greg Ewing wrote:
>Michael Hudson wrote:
>
> > This is, to me, neat and clear.  I don't find the idea that iterators
> > are tied to exactly 1 for loop an improvement (even though they
> > usually will be).
>
>To fix this in a fully backward-compatible way, we
>need some way of distinguishing generators that
>expect to be finalized.

No, we don't; Guido's existing proposal is quite sufficient for using 
yield.  We don't need to create another old/new distinction here.


From pje at telecommunity.com  Thu May 19 14:43:44 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Thu, 19 May 2005 08:43:44 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <428C2D9D.2080803@canterbury.ac.nz>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<ca471dc2050518093929da936c@mail.gmail.com>
Message-ID: <5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>

At 06:09 PM 5/19/2005 +1200, Greg Ewing wrote:
>Guido van Rossum wrote:
>
> > - When a generator is GC'ed, its close() method is called (which is a
> > no-op if it is already closed).
>
>Does this mean that all generators will be ineligible
>for cyclic garbage collection (since they implicitly
>have something equivalent to a __del__ method)?

No, since it's implemented in C.  (The C equivalent to __del__ does not 
interfere with cyclic GC.)


From tlesher at gmail.com  Thu May 19 15:10:32 2005
From: tlesher at gmail.com (Tim Lesher)
Date: Thu, 19 May 2005 09:10:32 -0400
Subject: [Python-Dev] python-dev Summary for 2005-05-01 through 2005-05-15
	[draft]
Message-ID: <9613db60050519061018ae0b8@mail.gmail.com>

Here's the first draft of the python-dev summary for the first half of
May. Please send any corrections or suggestions to the summarizers (CC'ed).

======================
Summary Announcements
======================

----------------------------------------------
PEP 340 Episode 2: Revenge of the With (Block)
----------------------------------------------

This fortnight's Python-Dev was dominated again by another nearly 400
messages on the topic of anonymous block statements. The discussion
was a little more focused than the last thanks mainly to Guido's
introduction of `PEP 340`_. Discussion of this PEP resulted in a
series of other PEPs, including

* `PEP 342`_: Enhanced Iterators, which broke out into a separate
PEP the parts of `PEP 340`_ that allowed code to pass values into
iterators using ``continue EXPR`` and yield-expressions.

* `PEP 343`_: Anonymous Block Redux, a dramatically simplified
version of `PEP 340`_, which removed the looping nature of the
anonymous blocks and the injection-of-exceptions semantics for
generators.

* `PEP 3XX`_: User Defined ("with") Statements, which proposed
non-looping anonymous blocks accompanied by finalization semantics
for iterators and generators in for loops.

Various details of each of these proposals are discussed below in the
sections:

1. `Enhanced Iterators`_

2. `Separate APIs for Iterators and Anonymous Blocks`_

3. `Looping Anonymous Blocks`_

4. `Loop Finalization`_

At the time of this writing, it looked like the discussion was coming
very close to a final agreement; `PEP 343`_ and `PEP 3XX`_ both agreed
upon the same semantics for the block-statement, the keyword had been
narrowed down to either ``do`` or ``with``, and Guido had agreed to
add back in to `PEP 343`_ some form of exception-injection semantics
for generators.


.. _PEP 340: http://www.python.org/peps/pep-0340.html

.. _PEP 342: http://www.python.org/peps/pep-0342.html

.. _PEP 343: http://www.python.org/peps/pep-0343.html

.. _PEP 3XX: http://members.iinet.net.au/~ncoghlan/public/pep-3XX.html

[SJB]


=========
Summaries
=========

------------------
Enhanced Iterators
------------------

`PEP 340`_ incorporated a variety of orthogonal features into a single
proposal. To make the PEP somewhat less monolithic, the method for
passing values into an iterator was broken off into `PEP 342`_. This
method includes:

* updating the iterator protocol to use .__next__() instead of .next()

* introducing a new builtin next()

* allowing continue-statements to pass values into iterators

* allowing generators to receive values with a yield-expression

Though these features had seemed mostly uncontroversial, Guido seemed
inclined to wait for a little more motivation from the co-routiney
people before accepting the proposal.
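
As it eventually shipped in Python 2.5, PEP 342 spells "passing values into
an iterator" as a send() method feeding a yield-expression (the
``continue EXPR`` form was dropped). A minimal sketch in modern syntax:

```python
def accumulator():
    total = 0
    while True:
        value = yield total  # the yield-expression receives sent values
        total += value

acc = accumulator()
next(acc)            # advance to the first yield
print(acc.send(10))  # 10
print(acc.send(5))   # 15
```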

Contributing threads:

- `Breaking off Enhanced Iterators PEP from PEP 340 <
http://mail.python.org/pipermail/python-dev/2005-May/053463.html>`__

[SJB]


------------------------------------------------
Separate APIs for Iterators and Anonymous Blocks
------------------------------------------------

`PEP 340`_ had originally proposed to treat the anonymous block
protocol as an extension of the iterator protocol. Several problems
with this approach were raised, including:

* for-loops could accidentally be used with objects requiring blocks,
meaning that resources would not get cleaned up properly

* blocks could be used instead of for-loops, violating TOOWTDI

As a result, both `PEP 343`_ and `PEP 3XX`_ propose decorators for
generator functions that will wrap the generator object appropriately
to match the anonymous block protocol. Generator objects without the
proposed decorators would not be usable in anonymous block statements.

Contributing threads:

- `PEP 340 -- loose ends <
http://mail.python.org/pipermail/python-dev/2005-May/053206.html>`__
- `PEP 340 -- concept clarification <
http://mail.python.org/pipermail/python-dev/2005-May/053280.html>`__

[SJB]


------------------------
Looping Anonymous Blocks
------------------------

A few issues arose as a result of `PEP 340`_'s formulation of
anonymous blocks as a variation on a loop.

Because the anonymous blocks of `PEP 340`_ were defined in terms of
while-loops, there was some discussion as to whether they should have
an ``else`` clause like Python ``for`` and ``while`` loops do. There
didn't seem to be one obvious interpretation of an ``else`` block
though, so Guido rejected the ``else`` block proposal.

The big issue with looping anonymous blocks, however, was in the
handling of ``break`` and ``continue`` statements. Many use cases for
anonymous blocks did not require loops. However, because `PEP 340`_
anonymous blocks were implemented in terms of loops, ``break`` and
``continue`` acted much like they would in a loop. This meant that in
code like::

    for item in items:
        with lock:
            if handle(item):
                break

the ``break`` statement would only break out of the anonymous block
(the ``with`` statement) instead of breaking out of the for-loop. This
pretty much shot down `PEP 340`_; there were too many cases where an
anonymous block didn't look like a loop, and having it behave like one
would have been a major stumbling block in learning the construct.

As a result, both `PEP 343`_ and `PEP 3XX`_ were proposed as
non-looping versions of `PEP 340`_.
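Under the non-looping semantics that `PEP 343`_ ultimately adopted, a
``break`` inside a ``with`` block does propagate to the enclosing loop.
A minimal sketch (using the modern ``with`` statement) illustrates the
behavior the proposals were after:

```python
import threading

lock = threading.Lock()
handled = []

for item in [1, 2, 3]:
    with lock:  # non-looping: break passes straight through to the for-loop
        handled.append(item)
        if item == 2:
            break  # leaves the for-loop, not just the with block

# Only the first two items were handled; the lock was released each time.
assert handled == [1, 2]
```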


Contributing threads:

- `PEP 340: Else clause for block statements <
http://mail.python.org/pipermail/python-dev/2005-May/053190.html>`__
- `PEP 340 -- loose ends <
http://mail.python.org/pipermail/python-dev/2005-May/053206.html>`__
- `PEP 340 -- concept clarification <
http://mail.python.org/pipermail/python-dev/2005-May/053226.html>`__
- `PEP 340: Breaking out. <
http://mail.python.org/pipermail/python-dev/2005-May/053223.html>`__
- `PEP 340: Non-looping version (aka PEP 310 redux) <
http://mail.python.org/pipermail/python-dev/2005-May/053400.html>`__
- `PEP 340 - Remaining issues <
http://mail.python.org/pipermail/python-dev/2005-May/053406.html>`__
- `PEP 340: Deterministic Finalisation (new PEP draft, either a competitor 
or update to PEP 340) <
http://mail.python.org/pipermail/python-dev/2005-May/053503.html>`__
- `Merging PEP 310 and PEP 340-redux? <
http://mail.python.org/pipermail/python-dev/2005-May/053591.html>`__
- `PEP 343 - Abstract Block Redux <
http://mail.python.org/pipermail/python-dev/2005-May/053731.html>`__

[SJB]


-----------------
Loop Finalization
-----------------

Greg Ewing pointed out that a generator with a yield inside a
block-statement would require additional work to guarantee its
finalization. For example, if the generator::

    def all_lines(filenames):
        for name in filenames:
            with open(name) as f:
                for line in f:
                    yield line

were used in code like::

    for line in all_lines(filenames):
        if some_cond(line):
            break

then unless the for-loop performed some sort of finalization on the
all_lines generator, the last-opened file could remain open
indefinitely.

As a result, `PEP 3XX`_ proposes that for-loops check for a
__finish__() method on their iterators, and if one exists, call that
method when the for-loop completes. Generators like all_lines above,
that put a yield inside a block-statement, would then acquire a
__finish__() method that would raise a TerminateIteration exception
at the point of the last yield. The TerminateIteration exception would
thus cause the block-statement to complete, guaranteeing that the
generator was properly finalized.
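The __finish__() proposal never landed as such; what Python eventually
provided (via PEP 342) is a close() method that throws GeneratorExit at
the paused yield, so a caller can guarantee finalization explicitly, for
example with contextlib.closing. A minimal sketch of the scenario above:

```python
import os
import tempfile
from contextlib import closing

def all_lines(filenames):
    for name in filenames:
        with open(name) as f:
            for line in f:
                yield line

# Create a sample file for the demonstration.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "sample.txt")
with open(path, "w") as f:
    f.write("one\ntwo\nthree\n")

# closing() calls the generator's close() on exit, which raises
# GeneratorExit at the paused yield; the with-block inside the
# generator then closes the file even though we break out early.
seen = []
with closing(all_lines([path])) as lines:
    for line in lines:
        seen.append(line.rstrip())
        if line.startswith("two"):
            break

assert seen == ["one", "two"]
```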

Contributing threads:

- `PEP 340 - For loop cleanup, and feature separation <
http://mail.python.org/pipermail/python-dev/2005-May/053432.html>`__
- `PEP 340: Deterministic Finalisation (new PEP draft, either a competitor 
or update to PEP 340) <
http://mail.python.org/pipermail/python-dev/2005-May/053503.html>`__
- `PEP 343 - Abstract Block Redux <
http://mail.python.org/pipermail/python-dev/2005-May/053731.html>`__

[SJB]


----------------------------
Breaking out of Nested Loops
----------------------------

As a result of some of the issues of looping anonymous blocks, a few
threads discussed options for breaking out of nested loops. These
mainly worked by augmenting the ``break`` statement with another
keyword (or keywords) that would indicate which loop to break out of.

One proposal suggested that ``break`` be followed with ``for`` or
``while`` to indicate which loop to break out of. But ``break for``
would only really be useful in a while-loop nested within a for-loop,
and ``break while`` would only really be useful in a for-loop nested
within a while-loop. That is, because loops could only be named by
type, the proposal was only useful when loops of different types were
mixed. This suggestion was thus discarded as not being general enough.

A few other suggestions were briefly discussed: adding labels to
loops, using an integer to indicate which "stack level" to break at,
and pushing breaks onto a "break buffer", but Guido killed the
discussion, saying, `"Stop all discussion of breaking out of multiple
loops. It ain't gonna happen before my retirement."
<http://mail.python.org/pipermail/python-dev/2005-May/053592.html>`__

Contributing threads:

- `PEP 340: Breaking out. <
http://mail.python.org/pipermail/python-dev/2005-May/053223.html>`__
- `PEP 340: Deterministic Finalisation (new PEP draft, either a competitor 
or update to PEP 340) <
http://mail.python.org/pipermail/python-dev/2005-May/053503.html>`__

[SJB]


------------------------
The future of exceptions
------------------------

Ka-Ping Yee suggested that instead of passing (type, value, traceback)
tuples in exceptions it would be better to put the traceback in
value.traceback. Guido had also suggested this (in the `PEP 340`_ murk) but
pointed out that this would not work as long as string exceptions exist
(as there is nowhere to put the traceback).

Guido noted that there are no concrete plans as to when string exceptions
will be deprecated and removed (other than in 3.0 at the latest); he
indicated that it could be sooner, if someone wrote a PEP with a timeline
(e.g. deprecated in 2.5, gone in 2.6).

Brett C. volunteered to write a PEP targeted at Python 3000 covering
exception changes (base inheritance, standard attributes (e.g. .traceback),
reworking the built-in exception inheritance hierarchy, and the future of
bare except statements). 

Contributing threads:

- `Tidier Exceptions <
http://mail.python.org/pipermail/python-dev/2005-May/053671.html>`__

.. _PEP 340: http://www.python.org/peps/pep-0340.html

[TAM]


-----------------------------------
Unifying try/except and try/finally
-----------------------------------

Reinhold Birkenfeld submitted a Pre-PEP to allow both except and finally
clauses in try blocks. For example, a construction like::

    try:
        <suite 1>
    except Ex1:
        <suite 2>
    <more except: clauses>
    else:
        <suite 3>
    finally:
        <suite 4>

would be exactly the same as the legacy::

    try:
        try:
            <suite 1>
        except Ex1:
            <suite 2>
        <more except: clauses>
        else:
            <suite 3>
    finally:
        <suite 4>

Guido liked this idea (so much that he wanted to accept it immediately),
and recommended that it be checked in as a PEP. However, Tim Peters
pointed out that this functionality was removed from Python (by Guido) way
back in 0.9.6, seemingly because there was confusion about exactly when
the finally clause would be called (explicit is better than implicit!).
Guido clarified that control would only pass forward, and indicated that
he felt that since this is now available in Java (and C#) fewer people
would be confused. The main concern about this change was that, while the
cost was low, it seemed to add very little value.
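The proposal was in fact accepted (as PEP 341, in Python 2.5), and the
control flow Guido clarified can be checked directly: except/else apply
to the inner logical try, and finally always runs last.

```python
events = []

def run(fail):
    try:
        if fail:
            raise ValueError
        events.append("body")
    except ValueError:
        events.append("except")
    else:
        events.append("else")
    finally:
        events.append("finally")

run(fail=False)
run(fail=True)

# else runs only on success; finally runs last on both paths.
assert events == ["body", "else", "finally", "except", "finally"]
```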

Contributing threads:

- `Pre-PEP: Unifying try-except and try-finally <
http://mail.python.org/pipermail/python-dev/2005-May/053290.html>`__

[TAM]


-----------------
Decorator Library
-----------------

Michele Simionato asked whether a module for commonly used decorators, or
utilities to create decorators, was planned. Raymond Hettinger indicated
that while this was likely in the long term, he felt that it was better if
these first evolved via wikis, recipes, or mailing lists, so that a module
would only be added once best practices and proven winners had emerged.
In the meantime, there is both a `Decorator Library wiki page`_ and
you can try out `Michele's library`_ [zip].

To assist with decorator creation, Michele would like a facility to copy a
function. Phillip J. Eby noted that the informally-discussed proposal is
to add a mutable __signature__ to functions to assist with signature
preserving decorators. Raymond suggested a patch adding a __copy__ method
to functions or a patch for the copy module, and Michele indicated that he
would also like to subclass FunctionType with a user-defined __copy__
method.
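The signature-preserving concern was later addressed, at least for the
common metadata, by functools.wraps (added in Python 2.5). A minimal
sketch (the ``logged`` decorator is hypothetical):

```python
import functools

def logged(func):
    """Decorator that preserves the wrapped function's metadata."""
    @functools.wraps(func)  # copies __name__, __doc__, __module__, __dict__
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@logged
def add(a, b):
    "Add two numbers."
    return a + b

# The wrapper advertises the wrapped function's name and docstring.
assert add.__name__ == "add"
assert add.__doc__ == "Add two numbers."
```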

Contributing threads:

- `my first post: asking about a "decorator" module <
http://mail.python.org/pipermail/python-dev/2005-May/053316.html>`__
- `The decorator module <
http://mail.python.org/pipermail/python-dev/2005-May/053446.html>`__

.. _Decorator Library wiki page: 
http://www.python.org/moin/PythonDecoratorLibrary 
.. _Michele's library: 
http://www.phyast.pitt.edu/~micheles/python/decorator.zip

[TAM]

---------------------
Hooking Py_FatalError
---------------------

Errors that invoke Py_FatalError generally signify that the internal state
of Python is in such a poor state that continuing (including raising an
exception) is impossible or unwise; as a result, Py_FatalError outputs the
error to stderr and calls abort(). m.u.k. would like to have a callback to
hook Py_FatalError to avoid this call to abort(). The general consensus
was that effort would be better directed to fixing the causes of fatal
errors than hooking Py_FatalError. m.u.k.'s use case was for generating
additional logging information; a `callback system patch`_ (revised by
James William Pye) is available for those interested.

Contributing threads:

- `Need to hook Py_FatalError <
http://mail.python.org/pipermail/python-dev/2005-May/053218.html>`__

.. _callback system patch: http://python.org/sf/1195571

-------------------
Chaining Exceptions
-------------------

Ka-Ping Yee suggested adding information to exceptions when they are raised
in the handler for another exception. For example::

    def a():
        try:
            raise AError
        except:
            raise BError

raises an exception which is an instance of BError. This instance could
have an attribute which is instance of AError, containing information about
the original exception. Use cases include catching a low-level exception
(e.g. socket.error) and turning it into a high-level exception (e.g.
an HTTPRequestFailed exception) and handling problems in exception handling
code. Guido liked the idea, and discussion fleshed out a tighter
definition; however it was unclear whether adding this now was feasible -
this would perhaps be best added in Python 3000.
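The idea did land in Python 3000, as PEP 3134: the original exception is
attached as __cause__ (explicit ``raise ... from``) or __context__
(implicit). A sketch of the low-level/high-level use case above, with a
hypothetical HTTPRequestFailed exception:

```python
class HTTPRequestFailed(Exception):
    """Hypothetical high-level exception for illustration."""

def fetch():
    try:
        raise OSError("connection reset")  # stand-in for socket.error
    except OSError as exc:
        raise HTTPRequestFailed("request failed") from exc

try:
    fetch()
except HTTPRequestFailed as err:
    cause = err.__cause__

# The low-level cause travels with the high-level exception.
assert isinstance(cause, OSError)
```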

Contributing threads:

- `Chained Exceptions <
http://mail.python.org/pipermail/python-dev/2005-May/053672.html>`__

[TAM]

------------------
Py_UNICODE madness
------------------

Nicholas Bastin noted an apparent bug in the documentation for
Py_UNICODE, which states that Py_UNICODE is a 16-bit value implemented
as an alias for wchar_t when that type is available, and for unsigned
short otherwise. However, on recent Red Hat releases, PY_UNICODE_SIZE
turns out to be 4. Guido and Marc-Andre Lemburg both agreed that the
documentation is incorrect, so Nicholas set out to fix the
documentation by removing the 16-bit reference and adding a caveat for
extension developers not to make assumptions about the size of Py_UNICODE.

It wasn't *quite* that simple. A long discussion ensued over why
Python sometimes needs to default to UCS-4 Unicode (to avoid
breaking an existing UCS-4 Tkinter), whether to expose the size of
Py_UNICODE to extension developers (they won't take 'no' for an
answer), and whether Python should provide better support for high
surrogate pairs (it should). The matter remained open.

Contributing threads:

- `Py_UNICODE madness <
http://mail.python.org/pipermail/python-dev/2005-May/053264.html>`__
- `New Py_UNICODE doc <
http://mail.python.org/pipermail/python-dev/2005-May/053311.html>`__
- `New Py_UNICODE doc (Another Attempt) <
http://mail.python.org/pipermail/python-dev/2005-May/053480.html>`__


[TDL]

------------------------------
Python's Unicode width default 
------------------------------

Marc-Andre Lemburg objected to Python's build process automatically
changing its default Unicode size from UCS2 to UCS4 at build time when
a UCS4 version of Tcl is found. Martin Löwis argued that having
Tkinter always work out of the box was more important than having a
hard-and-fast Unicode default configuration; Marc dissented.

Shane Hathaway opined that it could be a runtime rather than a
compile-time decision, and Bob Ippolito mentioned NSString from
OpenDarwin's libFoundation and CFString from Apple's CoreFoundation
libraries for implementation ideas.

Contributing threads:

- `Python's Unicode width default (New Py_UNICODE doc) <
http://mail.python.org/pipermail/python-dev/2005-May/053574.html>`__

[TDL]

===============
Skipped Threads
===============

- `Keyword for block statements <
http://mail.python.org/pipermail/python-dev/2005-May/053189.html>`__
- `PEP 340 - possible new name for block-statement <
http://mail.python.org/pipermail/python-dev/2005-May/053195.html>`__
- `Generating nested data structures with blocks <
http://mail.python.org/pipermail/python-dev/2005-May/053204.html>`__
- `PEP 340 -- Clayton's keyword? <
http://mail.python.org/pipermail/python-dev/2005-May/053377.html>`__
- `PEP 340: Only for try/finally? <
http://mail.python.org/pipermail/python-dev/2005-May/053258.html>`__
- `2 words keyword for block <
http://mail.python.org/pipermail/python-dev/2005-May/053251.html>`__
- `anonymous blocks <
http://mail.python.org/pipermail/python-dev/2005-May/053297.html>`__
- `"begin" as keyword for pep 340 <
http://mail.python.org/pipermail/python-dev/2005-May/053315.html>`__
- `PEP 340: propose to get rid of 'as' keyword <
http://mail.python.org/pipermail/python-dev/2005-May/053320.html>`__
- `PEP 340 keyword: after <
http://mail.python.org/pipermail/python-dev/2005-May/053396.html>`__
- `PEP 340 keyword: Extended while syntax <
http://mail.python.org/pipermail/python-dev/2005-May/053409.html>`__
- `PEP 340 - Remaining issues - keyword <
http://mail.python.org/pipermail/python-dev/2005-May/053428.html>`__
- `PEP 340: Examples as class's. <
http://mail.python.org/pipermail/python-dev/2005-May/053423.html>`__
- `Proposed alternative to __next__ and __exit__ <
http://mail.python.org/pipermail/python-dev/2005-May/053514.html>`__
- `"with" use case: exception chaining <
http://mail.python.org/pipermail/python-dev/2005-May/053665.html>`__
- `PEP 343: Resource Composition and Idempotent __exit__ <
http://mail.python.org/pipermail/python-dev/2005-May/053767.html>`__
- `[Python-checkins] python/nondist/peps pep-0343.txt, 1.8, 1.9 <
http://mail.python.org/pipermail/python-dev/2005-May/053766.html>`__
- `the current behavior of try: ... finally: <
http://mail.python.org/pipermail/python-dev/2005-May/053692.html>`__
- `a patch to inspect and a non-feature request <
http://mail.python.org/pipermail/python-dev/2005-May/053653.html>`__
- `Python 2.4 set objects and cyclic garbage <
http://mail.python.org/pipermail/python-dev/2005-May/053630.html>`__
- `CHANGE BayPIGgies: May *THIRD* Thurs <
http://mail.python.org/pipermail/python-dev/2005-May/053628.html>`__
- `Python continually calling sigprocmask() on FreeBSD 5 <
http://mail.python.org/pipermail/python-dev/2005-May/053615.html>`__
- `Weekly Python Patch/Bug Summary <
http://mail.python.org/pipermail/python-dev/2005-May/053213.html>`__
- `problems with memory management <
http://mail.python.org/pipermail/python-dev/2005-May/053408.html>`__
- `Adding DBL_MANTISSA and such to Python <
http://mail.python.org/pipermail/python-dev/2005-May/053372.html>`__
- `python-dev Summary for 2005-04-16 through 2005-04-30 [draft] <
http://mail.python.org/pipermail/python-dev/2005-May/053383.html>`__
- `Python Language track at Europython, still possibilities to submit talks 
<http://mail.python.org/pipermail/python-dev/2005-May/053303.html>`__
- `(no subject) <
http://mail.python.org/pipermail/python-dev/2005-May/053196.html>`__
- `Kernel panic writing to /dev/dsp with cmpci driver <
http://mail.python.org/pipermail/python-dev/2005-May/053627.html>`__


-- 
Tim Lesher <tlesher at gmail.com>

From facundobatista at gmail.com  Thu May 19 15:53:54 2005
From: facundobatista at gmail.com (Facundo Batista)
Date: Thu, 19 May 2005 10:53:54 -0300
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <000601c55b6f$43231fc0$ab29a044@oemcomputer>
References: <1f7befae0505172128f1f9daa@mail.gmail.com>
	<000601c55b6f$43231fc0$ab29a044@oemcomputer>
Message-ID: <e04bdf3105051906532a3b685d@mail.gmail.com>

On 5/18/05, Raymond Hettinger <python at rcn.com> wrote:


> >>> from decimal import getcontext
> >>> context = getcontext()
> >>> x = context.create_decimal('3.104')
> >>> y = context.create_decimal('2.104')
> >>> z = context.create_decimal('0.000')
> >>> context.prec = 3
> >>> x + y
> Decimal("5.21")
> >>> x + z + y
> Decimal("5.20")

My point here is to always remind everybody that Decimal solves the
problem with binary floating point, but not with representation
issues. If you don't have enough precision (for example to represent
one third), you'll get mysterious results.

That's why, IMO, the Spec provides two traps, one for Rounded, and one
for Inexact, to be aware of what exactly is happening.


> As for why the normal Decimal constructor is context free, PEP 327
> indicates discussion on the subject, but who made the decision and why
> is not clear.

There was no decision. Originally the context didn't get applied at
creation time. Then the situation arose where it would be nice to be
able to apply it at creation time (for situations where not doing so
would be costly), so a method in the context was born.

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

From facundobatista at gmail.com  Thu May 19 15:57:45 2005
From: facundobatista at gmail.com (Facundo Batista)
Date: Thu, 19 May 2005 10:57:45 -0300
Subject: [Python-Dev] Decimal construction
In-Reply-To: <20050518130555.GA9515@panix.com>
References: <E1DYALw-0001Rd-RB@sc8-pr-cvs1.sourceforge.net>
	<000101c55b5c$56f14b20$ab29a044@oemcomputer>
	<1f7befae0505172128f1f9daa@mail.gmail.com>
	<20050518130555.GA9515@panix.com>
Message-ID: <e04bdf3105051906571441de63@mail.gmail.com>

On 5/18/05, Aahz <aahz at pythoncraft.com> wrote:

> Not sure what the "right" answer is, but I wanted to stick my oar in to
> say that I think that Decimal has not been in the field long enough or
> widely-enough used that we should feel that the API has been set in
> stone.  If there's agreement that a mistake was made, let's fix it!

+1.

BTW, it's worth noting that for Money
(http://sourceforge.net/projects/pymoney) we decided to apply the
context at creation time....

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

From python at rcn.com  Thu May 19 18:04:37 2005
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 19 May 2005 12:04:37 -0400
Subject: [Python-Dev] Decimal construction
In-Reply-To: <20050518130555.GA9515@panix.com>
Message-ID: <001601c55c8c$755c7ea0$2db0958d@oemcomputer>

> Not sure what the "right" answer is, but I wanted to stick my oar in to
> say that I think that Decimal has not been in the field long enough or
> widely-enough used that we should feel that the API has been set in
> stone.  If there's agreement that a mistake was made, let's fix it!

There is not agreement.  I prefer the current behavior and think
changing it would introduce more problems than it would solve.  Further,
the API currently provides both context aware and context free
construction -- all the tools needed are already there.  Let's leave
this alone and simply document the best practices (using unary plus
after a precision change and constructing using create_decimal whenever
context is important to construction).



Raymond

From foom at fuhm.net  Thu May 19 18:36:58 2005
From: foom at fuhm.net (James Y Knight)
Date: Thu, 19 May 2005 12:36:58 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
Message-ID: <A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>

On May 19, 2005, at 8:43 AM, Phillip J. Eby wrote:
> At 06:09 PM 5/19/2005 +1200, Greg Ewing wrote:
>> Guido van Rossum wrote:
>>> - When a generator is GC'ed, its close() method is called (which  
>>> is a
>>> no-op if it is already closed).
>>
>> Does this mean that all generators will be ineligible
>> for cyclic garbage collection (since they implicitly
>> have something equivalent to a __del__ method)?
>
> No, since it's implemented in C.  (The C equivalent to __del__ does  
> not
> interfere with cyclic GC.)

But you're missing the point -- there's a *reason* that __del__  
interferes with cyclic GC. It doesn't just do it for the heck of it!  
You can't simply have the C delete call into python code...the  
objects the generator has references to may be invalid objects  
already because they've been cleared to break a cycle. If you want to  
execute python code on collection of a generator, it must be done via  
__del__, or else it'll be horribly, horribly broken.

James

From pje at telecommunity.com  Thu May 19 18:59:39 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Thu, 19 May 2005 12:59:39 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
References: <5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<ca471dc2050518093929da936c@mail.gmail.com>
	<ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>

At 12:36 PM 5/19/2005 -0400, James Y Knight wrote:
>On May 19, 2005, at 8:43 AM, Phillip J. Eby wrote:
>>At 06:09 PM 5/19/2005 +1200, Greg Ewing wrote:
>>>Guido van Rossum wrote:
>>>>- When a generator is GC'ed, its close() method is called (which
>>>>is a
>>>>no-op if it is already closed).
>>>
>>>Does this mean that all generators will be ineligible
>>>for cyclic garbage collection (since they implicitly
>>>have something equivalent to a __del__ method)?
>>
>>No, since it's implemented in C.  (The C equivalent to __del__ does
>>not
>>interfere with cyclic GC.)
>
>But you're missing the point -- there's a *reason* that __del__
>interferes with cyclic GC. It doesn't just do it for the heck of it!
>You can't simply have the C delete call into python code...the
>objects the generator has references to may be invalid objects
>already because they've been cleared to break a cycle. If you want to
>execute python code on collection of a generator, it must be done via
>__del__, or else it'll be horribly, horribly broken.

Eeyowch.  Good point.  OTOH, the only way a generator-iterator can become 
part of a cycle is via  an action taken outside the generator.  (E.g. 
passing it into itself via 'continue', creating a link from one of its 
arguments to it, etc.)  So, it's probably not a terrible limitation in 
practice.


From gvanrossum at gmail.com  Thu May 19 19:01:43 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 19 May 2005 10:01:43 -0700
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
Message-ID: <ca471dc2050519100126053782@mail.gmail.com>

[James Y Knight]
> But you're missing the point -- there's a *reason* that __del__
> interferes with cyclic GC. It doesn't just do it for the heck of it!
> You can't simply have the C delete call into python code...the
> objects the generator has references to may be invalid objects
> already because they've been cleared to break a cycle. If you want to
> execute python code on collection of a generator, it must be done via
> __del__, or else it'll be horribly, horribly broken.

Thank you for reminding me -- that's indeed the reason, and it applies
here. I think in the past I've unsuccessfully tried to argue that if a
cycle contains exactly one object with a Python-invoking finalizer,
that finalizer could be invoked before breaking the cycle. I still
think that's a sensible proposal, and generators may be the use case
to finally implement it.

All this suggests that generators should indeed have a __del__()
method which is synonymous with close() (I want close() to be the
user-facing API).

BTW I think that close() and __del__() should raise an exception when
the throw(GeneratorExit) call doesn't end up either re-raising
GeneratorExit or raising StopIteration. The framework for calling
__del__() takes care of handling this exception (by printing and then
ignoring it). Raymond take notice if you're still working on the PEP.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From gvanrossum at gmail.com  Thu May 19 19:09:00 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 19 May 2005 10:09:00 -0700
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
	<5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>
Message-ID: <ca471dc20505191009361550b1@mail.gmail.com>

[Phillip J. Eby]
> the only way a generator-iterator can become
> part of a cycle is via  an action taken outside the generator.  (E.g.
> passing it into itself via 'continue', creating a link from one of its
> arguments to it, etc.)  So, it's probably not a terrible limitation in
> practice.

It's enough to store a reference to the generator in a global (or in
anything that's reachable from a global). The generator's frame's
f_globals pointer then ensures the cycle.

Throwing an exception also provides ample opportunity for creating
cycles, since the frame holds a reference to the most recent traceback.
Ironically, throwing an exception with a traceback into a generator is
likely to cause a cycle because the traceback likely references the
throwing frame, which certainly has a reference to the generator...

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From python at rcn.com  Thu May 19 19:16:36 2005
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 19 May 2005 13:16:36 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP
	325:generator exceptions and cleanup
In-Reply-To: <ca471dc2050519100126053782@mail.gmail.com>
Message-ID: <001a01c55c96$8398b4c0$2db0958d@oemcomputer>

> BTW I think that close() and __del__() should raise an exception when
> the throw(GeneratorExit) call doesn't end up either re-raising
> GeneratorExit or raising StopIteration. The framework for calling
> __del__() takes care of handling this exception (by printing and then
> ignoring it). Raymond take notice if you're still working on the PEP.

Got it.


Raymond

From pje at telecommunity.com  Thu May 19 19:38:46 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Thu, 19 May 2005 13:38:46 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <ca471dc20505191009361550b1@mail.gmail.com>
References: <5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>
	<ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
	<5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>

At 10:09 AM 5/19/2005 -0700, Guido van Rossum wrote:
>[Phillip J. Eby]
> > the only way a generator-iterator can become
> > part of a cycle is via  an action taken outside the generator.  (E.g.
> > passing it into itself via 'continue', creating a link from one of its
> > arguments to it, etc.)  So, it's probably not a terrible limitation in
> > practice.
>
>It's enough to store a reference to the generator in a global (or in
>anything that's reachable from a global). The generator's frame's
>f_globals pointer then ensures the cycle.

Well, if the generator was defined in the same module, I suppose.  This 
would probably only happen with short scripts, where the lack of GC is 
unlikely to be an issue.

However, at least it would also be possible to explicitly close the 
generator, which wasn't possible before.


>Throwing an exception also provides ample opportunity for creating
>cycles, since the frame hold a reference to the most recent traceback.
>Ironically, throwing an exception with a traceback into a generator is
>likely to cause a cycle because the traceback likely references the
>throwing frame, which certainly has a reference to the generator...

*head exploding*  Double ouch.

Wait a minute...  those cycles don't include the generator, do they?  Let 
me think.  Frame A has a reference to the generator iterator, and invokes 
throw() (directly or indirectly) on it.  Frame B, the generator frame, gets 
its f_back set to point to Frame A, but presumably that link is cleared on 
exit?  (If it isn't, it probably should be).

Anyway, frame B throws an exception, and the traceback is created.  The 
traceback has a reference to frame B.  We return to frame A, and add it to 
the traceback as well, and a reference to the traceback goes into the frame 
too.  Hm.  Still no cycle passing through the *generator iterator*, unless 
the generator's frame's f_back is still pointing to the frame that it was 
last called from.  This holds even if the generator's frame holds the 
traceback that was current at the time of the error, because that traceback 
only includes the generator's frame, not the caller's frame.

So, as long as the generator guarantees its frame's f_back is empty while 
the generator is not actually executing, the cycle should not include the 
generator, so it will still be GC'able.  (Note, by the way, that this 
property is probably another good reason for using the yield expression as 
the source location for a throw()!)


From gvanrossum at gmail.com  Thu May 19 19:38:49 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 19 May 2005 10:38:49 -0700
Subject: [Python-Dev] Fwd: Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <1f7befae05051910244ae85c38@mail.gmail.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<428C2D9D.2080803@canterbury.ac.nz>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<ca471dc205051909185f56850a@mail.gmail.com>
	<1f7befae05051910244ae85c38@mail.gmail.com>
Message-ID: <ca471dc2050519103848911dc9@mail.gmail.com>

Here's the word on GC vs. __del__, by the way.

---------- Forwarded message ----------
From: Tim Peters <tim.peters at gmail.com>
Date: May 19, 2005 10:24 AM
Subject: Re: [Python-Dev] Combining the best of PEP 288 and PEP 325:
generator exceptions and cleanup
To: guido at python.org


> Remind me. Why again is GC afraid to touch objects with a __del__?
> (There's a good reason, it's so subtle I keep forgetting it and I
> can't seem to reconstruct the argument from first principles this
> morning.)

tp_clear is called on the objects in a cycle in an arbitrary order,
and it's possible for a __del__ method to (e.g.) resurrect any object
in the cycle.  But obj.tp_clear() doesn't necessarily leave obj in a
sane state, so we could end up resurrecting insane objects.

> Would the same reasoning apply to a generator that's part of a cycle if
> deleting the generator would cause more Python code to run?

The general rule now is that gc must guarantee that no object it
decided is trash can be reachable from any Python code by the time a
gc pass first calls tp_clear.  Calling tp_clear can certainly trigger
__del__ methods (and weakref callbacks), so it's not (a common
misunderstanding) the rule that __del__ methods can't run at all
during gc.  The real rule is subtler than that:  gc is happy to run a
__del__ method (or wr callback), provided that no trash is reachable
from such a method.

I haven't had the bandwidth to follow this discussion, but I sure
_suppose_ that other trash objects could be reachable from a trash
generator in a cycle.


-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tim.peters at gmail.com  Thu May 19 19:44:42 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Thu, 19 May 2005 13:44:42 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <ca471dc2050519100126053782@mail.gmail.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
	<ca471dc2050519100126053782@mail.gmail.com>
Message-ID: <1f7befae0505191044522c7647@mail.gmail.com>

[Guido]
> ...
> I think in the past I've unsuccessfully tried to argue that if a
> cycle contains exactly one object with a Python-invoking finalizer,
> that finalizer could be invoked before breaking the cycle. I still
> think that's a sensible proposal, and generators may be the use case
> to finally implement it.

You have argued it, and I've agreed with it.  The primary hangup is
that there's currently no code capable of doing it.  gc currently
determines the set of objects that must be part of cyclic trash, or
reachable only from cyclic trash, but has no relevant knowledge beyond
that.  For example, it doesn't know the difference between an object
that's in a trash cycle, and an object that's not in a trash cycle but
is reachable only from trash cycles.  In fact, it doesn't know
anything about the cycle structure.  That would require some sort of
new SCC (strongly connected component) analysis.

The graph derived from an arbitrary object graph by considering each
SCC to be "a node" is necessarily a DAG (contains no cycles), and the
general way to approach what you want here is to clear trash in a
topological sort of the SCC DAG:  so long as an SCC contains only one
object that may execute Python code, it's safe to run that object's
cleanup code first (for a meaning of "safe" that may not always
coincide with "explainable" or "predictable" <0.9 wink>).  gc would
probably need to give up after the first such thingie is run, if any
SCCs are reachable from the SCC X containing that thingie (gc can no
longer be sure that successor SCCs _are_ still trash:  they too are
reachable from X, so may have been resurrected by the Python code X
ran).
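A toy sketch of that SCC-condensation ordering (node names hypothetical; graphlib stands in here for the SCC analysis gc would need, and only exists in much later Pythons):

```python
from graphlib import TopologicalSorter

# Condensed SCC DAG of a hypothetical trash graph: SCC X = {A, B} is a
# cycle, with a chain C -> D -> E hanging off it.  Each key maps a node
# to its predecessors, so static_order() yields predecessors first.
scc_dag = {"X": set(), "C": {"X"}, "D": {"C"}, "E": {"D"}}
order = list(TopologicalSorter(scc_dag).static_order())
assert order == ["X", "C", "D", "E"]  # clear X first, then the chain
```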

There's currently no forcing at all of the order in which tp_clear
gets called, and currently no analysis done sufficient to support
forcing a relevant ordering.

From gvanrossum at gmail.com  Thu May 19 19:48:13 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 19 May 2005 10:48:13 -0700
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
	<5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>
	<ca471dc20505191009361550b1@mail.gmail.com>
	<5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>
Message-ID: <ca471dc20505191048192060ea@mail.gmail.com>

[Phillip J. Eby]
> >Throwing an exception also provides ample opportunity for creating
> >cycles, since the frame holds a reference to the most recent traceback.
> >Ironically, throwing an exception with a traceback into a generator is
> >likely to cause a cycle because the traceback likely references the
> >throwing frame, which certainly has a reference to the generator...
> 
> *head exploding*  Double ouch.
> 
> Wait a minute...  those cycles don't include the generator, do they?  Let
> me think.  Frame A has a reference to the generator iterator, and invokes
> throw() (directly or indirectly) on it.  Frame B, the generator frame, gets
> its f_back set to point to Frame A, but presumably that link is cleared on
> exit?  (If it isn't, it probably should be).
> 
> Anyway, frame B throws an exception, and the traceback is created.  The
> traceback has a reference to frame B.  We return to frame A, and add it to
> the traceback as well, and a reference to the traceback goes into the frame
> too.  Hm.  Still no cycle passing through the *generator iterator*, unless
> the generator's frame's f_back is still pointing to the frame that it was
> last called from.  This holds even if the generator's frame holds the
> traceback that was current at the time of the error, because that traceback
> only includes the generator's frame, not the caller's frame.
> 
> So, as long as the generator guarantees its frame's f_back is empty while
> the generator is not actually executing, the cycle should not include the
> generator, so it will still be GC'able.  (Note, by the way, that this
> property is probably another good reason for using the yield expression as
> the source location for a throw()!)

Hm. The way I see it, as soon as a generator raises an exception, its
frame is part of a cycle: the frame's f_exc_traceback points to the
traceback object, and the traceback object's tb_frame points back to
the frame. So that's a cycle right there.
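The f_exc_* frame slots discussed here are CPython-2-era, but the same frame/traceback shape can be observed today through the exception's __traceback__ attribute (function name hypothetical):

```python
def boom():
    try:
        raise ValueError("x")
    except ValueError as e:
        tb = e.__traceback__
        # The traceback references this frame via tb_frame, and this
        # frame's local 'tb' references the traceback: a cycle.
        assert tb.tb_frame.f_locals["tb"] is tb
        return True

assert boom()
```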

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Thu May 19 20:00:15 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Thu, 19 May 2005 14:00:15 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <ca471dc20505191048192060ea@mail.gmail.com>
References: <5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>
	<ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
	<5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>
	<ca471dc20505191009361550b1@mail.gmail.com>
	<5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050519135655.01d99ab8@mail.telecommunity.com>

At 10:48 AM 5/19/2005 -0700, Guido van Rossum wrote:
>Hm. The way I see it, as soon as a generator raises an exception, its
>frame is part of a cycle: the frame's f_exc_traceback points to the
>traceback object, and the traceback object's tb_frame points back to
>the frame. So that's a cycle right there.

But that cycle doesn't include the generator-iterator object, and it's not 
a collectable cycle while the iterator still lives.  Once the iterator 
itself goes away, that frame cycle is collectable.

However, Tim's new post brings up a different issue: if the collector can't 
tell the difference between a cycle participant and an object that's only 
reachable from a cycle, then the mere existence of a generator __del__ will 
prevent the cycle collection of the entire traceback/frame system that 
includes a generator-iterator reference anywhere!  And that's a pretty 
serious problem.


From mwh at python.net  Thu May 19 20:18:36 2005
From: mwh at python.net (Michael Hudson)
Date: Thu, 19 May 2005 19:18:36 +0100
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <5.1.1.6.0.20050519135655.01d99ab8@mail.telecommunity.com>
	(Phillip J. Eby's message of "Thu, 19 May 2005 14:00:15 -0400")
References: <5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>
	<ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
	<5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>
	<ca471dc20505191009361550b1@mail.gmail.com>
	<5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>
	<5.1.1.6.0.20050519135655.01d99ab8@mail.telecommunity.com>
Message-ID: <2md5rn2i6b.fsf@starship.python.net>

"Phillip J. Eby" <pje at telecommunity.com> writes:

> However, Tim's new post brings up a different issue: if the collector can't 
> tell the difference between a cycle participant and an object that's only 
> reachable from a cycle,

Uh, that's not what he meant:

/>> class C:
|..  def __del__(self):
|..   print 'bye'
\__ 
->> a = [C()]
->> a.append(a)
->> del a
->> gc.collect()
bye
1

Cheers,
mwh


-- 
  Now this is what I don't get.  Nobody said absolutely anything
  bad about anything.  Yet it is always possible to just pull
  random flames out of ones ass.
         -- http://www.advogato.org/person/vicious/diary.html?start=60

From pje at telecommunity.com  Thu May 19 20:27:08 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Thu, 19 May 2005 14:27:08 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
 generator exceptions and cleanup
In-Reply-To: <2md5rn2i6b.fsf@starship.python.net>
References: <5.1.1.6.0.20050519135655.01d99ab8@mail.telecommunity.com>
	<5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>
	<ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
	<5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>
	<ca471dc20505191009361550b1@mail.gmail.com>
	<5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>
	<5.1.1.6.0.20050519135655.01d99ab8@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050519142552.01d9a208@mail.telecommunity.com>

At 07:18 PM 5/19/2005 +0100, Michael Hudson wrote:
>"Phillip J. Eby" <pje at telecommunity.com> writes:
>
> > However, Tim's new post brings up a different issue: if the collector 
> can't
> > tell the difference between a cycle participant and an object that's only
> > reachable from a cycle,
>
>Uh, that's not what he meant:
>
>/>> class C:
>|..  def __del__(self):
>|..   print 'bye'
>\__
>->> a = [C()]
>->> a.append(a)
>->> del a
>->> gc.collect()
>bye
>1

Now you've shaken my faith in Uncle Timmy.  :)  Seriously, he did *say*:

"""For example, it doesn't know the difference between an object
that's in a trash cycle, and an object that's not in a trash cycle but
is reachable only from trash cycles."""

So now I wonder what he *did* mean.


From tim.peters at gmail.com  Thu May 19 20:30:53 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Thu, 19 May 2005 14:30:53 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <5.1.1.6.0.20050519135655.01d99ab8@mail.telecommunity.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
	<5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>
	<ca471dc20505191009361550b1@mail.gmail.com>
	<5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>
	<ca471dc20505191048192060ea@mail.gmail.com>
	<5.1.1.6.0.20050519135655.01d99ab8@mail.telecommunity.com>
Message-ID: <1f7befae05051911303257c535@mail.gmail.com>

[Phillip J. Eby]
> ...
> However, Tim's new post brings up a different issue: if the collector can't
> tell the difference between a cycle participant and an object that's only
> reachable from a cycle, then the mere existence of a generator __del__ will
> prevent the cycle collection of the entire traceback/frame system that
> includes a generator-iterator reference anywhere!  And that's a pretty
> serious problem.

It's not that simple <wink>.  If an object with a __del__ is not part
of a cycle, but is reachable only from trash cycles, that __del__ does
not inhibit garbage collection.  Like:

    A<->B -> C -> D -> E

where C, D and E have __del__, but A and B don't, and all are trash.

Relatively early on, gc "moves" C, D, and E into a special
"finalizers" list, and doesn't look at this list again until near the
end.  Then A.tp_clear() and B.tp_clear() are called in some order.  As
a *side effect* of calling B.tp_clear(), C's refcount falls to 0, and
Python's normal refcount-based reclamation (probably) recovers all of
C, D and E, and runs their __del__ methods.  Note that refcount-based
reclamation necessarily follows a DAG order:  E is still intact when
D.__del__ is called, and likewise D is still intact when C.__del__ is
called.  It's possible that C.__del__ will resurrect D and/or E, and
that D.__del__ will resurrect E.  In such cases, D and/or E's
refcounts don't fall to 0, and their __del__ methods won't be called
then.
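A runnable sketch of that A<->B -> C -> D -> E shape (class and variable names are mine, not from gcmodule.c).  A and B form a __del__-free trash cycle; C, D and E have __del__ and merely hang off it, yet all three get reclaimed:

```python
import gc

log = []

class NoDel:
    pass

class HasDel:
    def __init__(self, name, nxt=None):
        self.name, self.nxt = name, nxt
    def __del__(self):
        log.append(self.name)

# A<->B is a trash cycle with no __del__; C -> D -> E hangs off it.
e = HasDel("E")
d = HasDel("D", e)
c = HasDel("C", d)
a, b = NoDel(), NoDel()
a.other, b.other = b, a
b.tail = c
del a, b, c, d, e
gc.collect()

# All three __del__ methods ran; in the CPython of this era the
# refcount cascade ran them in DAG order (C, then D, then E).
assert sorted(log) == ["C", "D", "E"]
```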

Cyclic gc doesn't force any of that, though -- it's all a side effect
of the clear() in gcmodule.c's:

		if ((clear = op->ob_type->tp_clear) != NULL) {
			Py_INCREF(op);
			clear(op);
			Py_DECREF(op);

In turn, one of A and B get reclaimed as a side effect of the
Py_DECREF there -- it's one of the delights of gcmodule.c that if you
don't know the trick, you can stare at it for hours and never discover
where exactly it is anything gets released <0.9 wink>.  In fact, it
doesn't release anything directly -- "all it really does" now is break
reference cycles, so that Py_DECREF can do its end-of-life thing.

From tim.peters at gmail.com  Thu May 19 20:57:32 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Thu, 19 May 2005 14:57:32 -0400
Subject: [Python-Dev] Combining the best of PEP 288 and PEP 325:
	generator exceptions and cleanup
In-Reply-To: <5.1.1.6.0.20050519142552.01d9a208@mail.telecommunity.com>
References: <ca471dc2050518093929da936c@mail.gmail.com>
	<5.1.1.6.0.20050519084306.03d9be30@mail.telecommunity.com>
	<A512C75F-9F43-486C-B0A8-AFBC9BEF3B7B@fuhm.net>
	<5.1.1.6.0.20050519125314.01d9ba40@mail.telecommunity.com>
	<ca471dc20505191009361550b1@mail.gmail.com>
	<5.1.1.6.0.20050519132640.03426c20@mail.telecommunity.com>
	<5.1.1.6.0.20050519135655.01d99ab8@mail.telecommunity.com>
	<2md5rn2i6b.fsf@starship.python.net>
	<5.1.1.6.0.20050519142552.01d9a208@mail.telecommunity.com>
Message-ID: <1f7befae05051911575c583305@mail.gmail.com>

[Phillip J. Eby]
> Now you've shaken my faith in Uncle Timmy.  :)

Now, now, a mere technical matter is no cause for soul-damning heresy!

>  Seriously, he did *say*:
> 
> """For example, it doesn't know the difference between an object
> that's in a trash cycle, and an object that's not in a trash cycle but
> is reachable only from trash cycles."""
>
> So now I wonder what he *did* mean.

What I said, of course ;-)  I hope my later email clarified it.  gc
knows which trash objects have __del__ methods, and which don't. 
That's all it needs to know so that an object with a __del__ method
that's not in a trash cycle but is reachable only from trash cycles
will get reclaimed (provided that no __del__ method on a predecessor
object that's not in a trash cycle but is reachable only from trash
cycles resurrects it).  gc doesn't know whether the set of objects it
_intends_ to call tp_clear on are or aren't in cycles, but all objects
directly in __del__-free cycles are included in that set.  That's
enough to ensure that trash "hanging off of them" sees its refcounts
fall to 0 as gc applies tp_clear to the objects in that set.  Note
that this set can mutate as gc goes along:  calling tp_clear on one
object can (also as a side effect of refcounts falling to 0) remove
any number of other objects from that set (that's why I said "intends"
above:  there's no guarantee that gc will end up calling tp_clear on
any object other than "the first" one in the set, where "the first" is
utterly arbitrary now).

If an object in a trash cycle has a __del__ method, this is why the
cycle won't be reclaimed:  all trash objects with __del__ methods, and
everything transitively reachable from them, are moved to the
"finalizers" list early on.  If that happens to include a trash cycle
C, then all of C ends up in the "finalizers" list, and no amount of
tp_clear'ing on the objects that remain can cause the refcount on any
object in C to fall to 0.  gc has no direct knowledge of cycles in
this case either.
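For the record, the uncollectable-cycle rule described here is specific to the CPython of this era; a minimal reproduction of such a cycle, with a comment on what happens in each case:

```python
import gc

log = []

class Cyclic:
    def __del__(self):
        log.append("bye")

a, b = Cyclic(), Cyclic()
a.other, b.other = b, a   # a trash cycle whose members both define __del__
del a, b
gc.collect()

# In the CPython being discussed here, this cycle was moved to the
# "finalizers" list and left uncollected (visible in gc.garbage).
# Since PEP 442 (Python 3.4) the same cycle is finalized and reclaimed:
assert log == ["bye", "bye"]
```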

From tim.peters at gmail.com  Fri May 20 02:50:42 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Thu, 19 May 2005 20:50:42 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <000601c55b6f$43231fc0$ab29a044@oemcomputer>
References: <1f7befae0505172128f1f9daa@mail.gmail.com>
	<000601c55b6f$43231fc0$ab29a044@oemcomputer>
Message-ID: <1f7befae05051917503bcbe4c5@mail.gmail.com>

[Raymond Hettinger]
> For brevity, the above example used the context free
> constructor, but the point was to show the consequence
> of a precision change.

Yes, I understood your point.  I was making a different point: 
"changing precision" isn't needed _at all_ to get surprises from a
constructor that ignores context.  Your example happened to change
precision, but that wasn't essential to getting surprised by feeding
strings to a context-ignoring Decimal constructor.  In effect, this
creates the opportunity for everyone to get surprised by something only
experts should need to deal with.

There seems to be an unspoken "wow that's cool!" kind of belief that
because Python's Decimal representation is _potentially_ unbounded,
the constructor should build an object big enough to hold any argument
exactly (up to the limit of available memory).  And that would be
appropriate for, say, an unbounded rational type -- and is appropriate
for Python's unbounded integers.

But Decimal is a floating type with fixed (albeit user-adjustable)
precision, and ignoring that mixes arithmetic models in a
fundamentally confusing way.  I would have no objection to a named
method that builds a "big as needed to hold the input exactly" Decimal
object, but it shouldn't be the behavior of the
everyone-uses-it-constructor.  It's not an oversight that the IBM
standard defines no operations that ignore context (and note that
string->float is a standard operation):  it's trying to provide a
consistent arithmetic, all the way from input to output.  Part of
consistency is applying "the rules" everywhere, in the absence of
killer-strong reasons to ignore them.

Back to your point, maybe you'd be happier if a named (say)
apply_context() method were added?  I agree unary plus is a
funny-looking way to spell it (although that's just another instance
of applying the same rules to all operations).
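The unary-plus spelling already acts as the apply_context() operation in question; a quick check of all three behaviors as the module actually ships:

```python
from decimal import Decimal, getcontext

getcontext().prec = 3

d = Decimal("3.104159")   # the constructor ignores context: all digits kept
assert str(d) == "3.104159"

assert str(+d) == "3.10"  # unary plus applies the current context

# The conforming, context-aware conversion lives on the context object:
assert str(getcontext().create_decimal("3.104159")) == "3.10"
```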

From python at rcn.com  Fri May 20 04:27:01 2005
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 19 May 2005 22:27:01 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <e04bdf3105051906532a3b685d@mail.gmail.com>
Message-ID: <000901c55ce3$819849e0$2db0958d@oemcomputer>

[Tim suggesting that I'm clueless and dazzled by sparkling lights]
> There seems to be an unspoken "wow that's cool!" kind of belief 
> that because Python's Decimal representation is _potentially_ 
> unbounded, the constructor should build an object big enough to 
> hold any argument exactly (up to the limit of available memory).
> And that would be appropriate for, say, an unbounded rational 
> type -- and is appropriate for Python's unbounded integers.

I have no such thoughts but do strongly prefer the current design. I
recognize that it allows a user to specify an input at a greater
precision than the current context (in fact, I provided the example).

The overall design of the module and the spec is to apply context to the
results of operations, not their inputs.  In particular, the spec
recognizes that contexts can change and rather than specifying automatic
or implicit context application to all existing values, it provides the
unary plus operation so that such an application is explicit.  The use
of extra digits in a calculation is not invisible as the calculation
will signal Rounded and Inexact (if non-zero digits are thrown away).

One of the original motivating examples was "schoolbook" arithmetic
where the input string precision is incorporated into the calculation.
IMO, input truncation/rounding is inconsistent with that motivation.
Likewise, input rounding runs contrary to the basic goal of eliminating
representation error.

With respect to integration with the rest of Python (everything beyond
that spec but needed to work with it), I suspect that altering the
Decimal constructor is fraught with issues such as the
string-to-decimal-to-string roundtrip becoming context dependent.  I
haven't thought it through yet but suspect that it does not bode well
for repr(), pickling, shelving, etc.  Likewise, I suspect that traps
await multi-threaded or multi-context apps that need to share data.
Also, adding another step to the constructor is not going to help the
already disastrous performance.

I appreciate efforts to make the module as idiot-proof as possible.
However, that is a pipe dream.  By adopting and exposing the full
standard instead of the simpler X3.274 subset, using the module is a
non-trivial exercise and, even for experts, is a complete PITA.  Even a
simple fixed-point application (money, for example) requires dealing
with quantize(), normalize(), rounding modes, signals, etc.  By default,
outputs are not normalized so it is difficult even to recognize what a
zero looks like.  Just getting output without exponential notation is
difficult.  If someone wants to craft another module to wrap around and
candy-coat the Decimal API, I would be all for it.  Just recognize that
the full spec doesn't have a beginner mode -- for better or worse, we've
simulated a hardware FPU.

Lastly, I think it is a mistake to make a change at this point.  The
design of the constructor survived all drafts of the PEP,
comp.lang.python discussion, python-dev discussion, all early
implementations, sandboxing, the Py2.4 alpha/beta, cookbook
contributions, and several months in the field.  I say we document a
recommendation to use Context.create_decimal() and get on with life.



Clueless in Boston



P.S.  With 28 digit default precision, the odds of this coming up in
practice are slim (when was the last time you typed in a floating point
value with more than 28 digits; further, if you had, would it have
ruined your day if your 40 digits were not first rounded to 28 before
being used).  IOW, bug tracker lists hundreds of bigger fish to fry
without having to change a published API (pardon the mixed metaphor).

From tim.peters at gmail.com  Fri May 20 05:55:07 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Thu, 19 May 2005 23:55:07 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <000901c55ce3$819849e0$2db0958d@oemcomputer>
References: <e04bdf3105051906532a3b685d@mail.gmail.com>
	<000901c55ce3$819849e0$2db0958d@oemcomputer>
Message-ID: <1f7befae050519205524f7a087@mail.gmail.com>

Sorry, I simply can't make more time for this.  Shotgun mode:

[Raymond]
> I have no such thoughts but do strongly prefer the current
> design.

How can you strongly prefer it?  You asked me whether I typed floats
with more than 28 significant digits.  Not usually <wink>.  Do you? 
If you don't either, how can you strongly prefer a change that makes
no difference to what you do?

> ...
> The overall design of the module and the spec is to apply
> context to the results of operations, not their inputs.

But string->float is an _operation_ in the spec, as it has been since
1985 in IEEE-754 too.  The float you get is the result of that
operation, and is consistent with normal numeric practice going back
to the first time Fortran grew a distinction between double and single
precision.  There too the common practice was to write all literals as
double-precision, and leave it to the compiler to round off excess
bits if the assignment target was of single precision.  That made it
easy to change working precision via fiddling a single "implicit" (a
kind of type declaration) line.  The same kind of thing would be
pleasantly applicable for decimal too -- if the constructor followed
the rules.

> In particular, the spec recognizes that contexts can change
> and rather than specifying automatic or implicit
> application to all existing values, it provides the unary plus
> operation so that such an application is explicit.  The use
> of extra digits in a calculation is not invisible as the
> calculation will signal Rounded and Inexact (if non-zero digits
> are thrown away).

Doesn't change that the standard rigorously specifies how strings are
to be converted to decimal floats, or that our constructor
implementation doesn't do that.

> One of the original motivating examples was "schoolbook"
> arithmetic where the input string precision is incorporated
> into the calculation.

Sorry, doesn't ring a bell to me.  Whose example was this?

> IMO, input truncation/rounding is inconsistent with that
> motivation.

Try keying more digits into your hand calculator than it can hold <0.5 wink>.

> Likewise, input rounding runs contrary to the basic goal of
> eliminating representation error.

It's no surprise that an exact value containing more digits than
current precision gets rounded.  What _is_ surprising is that the
decimal constructor doesn't follow that rule, instead making up its
own rule.  It's an ugly inconsistency at best.

> With respect to integration with the rest of Python (everything
> beyond that spec but needed to work with it), I suspect that
> altering the Decimal constructor is fraught with issues such
> as the string-to-decimal-to-string roundtrip becoming context
> dependent.

Nobody can have a reasonable expectation that string -> float ->
string is an identity for any fixed-precision type across all strings.
That's just unrealistic.  You can expect string -> float -> string to
be an identity if the string carries no more digits than current
precision.  That's how a bounded type works.  Trying to pretend it's
not bounded in this one case is a conceptual mess.
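Concretely, the identity holds exactly when the string carries no more digits than the precision; sketched with Context.create_decimal, the conforming conversion:

```python
from decimal import Context

ctx = Context(prec=5)
s = "1.2345"
assert str(ctx.create_decimal(s)) == s                  # fits: round-trips
assert str(ctx.create_decimal("1.234567")) == "1.2346"  # doesn't: rounds
```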

> I haven't thought it through yet but suspect that it does not
> bode well for repr(), pickling, shelving, etc.

The spirit of the standard is always to deliver the best possible
approximation consistent with current context.  Unpickling and
unshelving should play that game too.  repr() has a special desire for
round-trip fidelity.

> Likewise, I suspect that traps await multi-threaded or multi-
> context apps that need to share data.

Like what?  Thread-local context precision is a reality here, going
far beyond just string->float.

> Also, adding another step to the constructor is not going to
> help the already disasterous performance.

(1) I haven't found it to be a disaster.  (2) Over the long term, the
truly speedy implementations of this standard will be limited to a
fixed set of relatively small precisions (relative to, say, 1000000,
not to 28 <wink>).  In that world it would be unboundedly more
expensive to require the constructor to save every bit of every input:
rounding string->float is a necessity for speedy operation over the
long term.

> I appreciate efforts to make the module as idiot-proof as
> possible.

That's not my interest here.  My interest is in a consistent,
std-conforming arithmetic, and all fp standards since IEEE-754
recognized that string->float is "an operation" much like every other
fp operation.  Consistency helps by reducing complexity.  Most users
will never bump into this, and experts have a hard enough job without
gratuitous deviations from a well-defined spec.  What's the _use case_
for carrying an unbounded amount of information into a decimal
instance?  It's going to get lost upon the first operation anyway.

> However, that is a pipe dream.  By adopting and exposing the
> full standard instead of the simpler X3.274 subset, using the
> module is a non-trivial exercise and, even for experts, is a
> complete PITA.

Rigorous numeric programming is a difficult art.  That's life.  The
many exacting details in the standard aren't the cause of that,
they're a distillation of decades of numeric experience by bona fide
numeric experts.  These are the tools you need to do a rigorous job --
and most users can ignore them completely, or at worst set precision
once at the start and forget it.  _Most_ of the stuff (by count) in
the standard is for the benefit of expert library authors, facing a
wide variety of externally imposed requirements.

> Even a simple fixed-point application (money, for example)
> requires dealing with quantize(), normalize(), rounding modes,
> signals, etc.

I don't know why you'd characterize a monetary application as
"simple".  To the contrary, they're as demanding as they come.  For
example, requirements for bizarre rounding come with that territory,
and the standard exposes tools to _help_ deal with that.  The standard
didn't invent rounding modes, it recognizes that needing to deal with
them is a fact of life, and that it's much more difficult to do
without any help from the core arithmetic.  So is needing to deal with
many kinds of exceptional conditions, and in different ways depending
on the app -- that's why all that machinery is there.

> By default, outputs are not normalized so it is difficult even
> to recognize what a zero looks like.

You're complaining about a feature there <wink>.  That is, the lack of
normalization is what makes 1.10 the result of 2.21 - 1.11, rather
than 1.1 or 1.100000000000000000000000000.  1.10 is what most people
expect.
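That non-normalizing behavior is easy to check, with normalize() as the explicit opt-in:

```python
from decimal import Decimal

diff = Decimal("2.21") - Decimal("1.11")
assert str(diff) == "1.10"             # trailing zero is preserved
assert str(diff.normalize()) == "1.1"  # normalize() strips it on request
```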

> Just getting output without exponential notation is difficult.

That's a gripe I have with the std too.  Its output formats are too
simple-minded and few.  I had the same frustration using REXX. 
Someday the %f/%g/%e format codes should learn how to deal with
decimals, and that would be pleasant enough for me.
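For what it's worth, later Pythons did teach the format machinery about Decimal; a check against a modern interpreter (not available in the decimal module as it shipped then):

```python
from decimal import Decimal

d = Decimal("1E+3")
assert str(d) == "1E+3"          # default str() keeps the exponent
assert format(d, "f") == "1000"  # fixed-point, no exponential notation
```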

> If someone wants to craft another module to wrap around and
> candy-coat the Decimal API, I would be all for it.

For example, Facundo is doing that with a money class, yes?  That's
fine.  The standard tries to support many common arithmetic needs, but
big as it is, it's just a start.

> Just recognize that the full spec doesn't have a beginner
> mode -- for better or worse, we've simulated a hardware FPU.

I haven't seen a HW FPU with unbounded precision, or one that does
decimal arithmetic.  Apart from the limited output modes, I have no
reason to suspect that a beginner will have any particular difficulty
with decimal.  They don't have to know anything about signals and
traps, rounding modes or threads, etc etc -- right out of the box,
except for output format it acts very much like a high-end hand
calculator.

> Lastly, I think it is a mistake to make a change at this point.

It's a worse mistake to let a poor decision slide indefinitely -- it
gets harder & harder to change it over time.  Heck, to listen to you,
decimal is so bloody complicated nobody could possibly be using it now
anyway <wink>.

> The design of the constructor survived all drafts of the PEP,
> comp.lang.python discussion, python-dev discussion, all early
> implementations, sandboxing, the Py2.4 alpha/beta, cookbook
> contributions, and several months in the field.

So did every other aspect of Python you dislike now <0.3 wink>.  It
never occurred to me that the implementation _wouldn't_ follow the
spec in its treatment of string->float.  I whined about that when I
discovered it, late in the game.  A new, conforming string->float
method was added then, but for some reason (or no reason) I don't
recall, the constructor wasn't changed.  That was a mistake.

>  I say we document a recommendation to use
> Context.create_decimal() and get on with life.

....

> P.S.  With 28 digit default precision, the odds of this coming
> up in practice are slim (when was the last time you typed in a
> floating point value with more than 28 digits; further, if you had,
> would it have ruined your day if your 40 digits were not first
> rounded to 28 before being used).

Depends on the app, of course.  More interesting is the day when
someone ports an app from a conforming implementation of the standard,
sets precision to (say) 8, and gets different results in Python
despite that the app stuck solely to standard operations.  Of course
that can be a genuine disaster for a monetary application -- extending
standards in non-obvious ways imposes many costs of its own, but they
fall on users, and aren't apparent at first.  I want to treat the
decimal module as if the standard it purports to implement will
succeed.

From greg.ewing at canterbury.ac.nz  Fri May 20 06:28:20 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 20 May 2005 16:28:20 +1200
Subject: [Python-Dev] [Python-checkins] python/nondist/peps pep-0343.txt, 1.11,
 1.12
In-Reply-To: <1f7befae0505172128f1f9daa@mail.gmail.com>
References: <E1DYALw-0001Rd-RB@sc8-pr-cvs1.sourceforge.net>
	<000101c55b5c$56f14b20$ab29a044@oemcomputer>
	<1f7befae0505172128f1f9daa@mail.gmail.com>
Message-ID: <428D6764.8060101@canterbury.ac.nz>

Tim Peters wrote:
> [Raymond Hettinger]

>>>>>from decimal import getcontext, Decimal as D
>>>>>getcontext().prec = 3
>>>>>D('3.104') + D('2.104')
>>
>>Decimal("5.21")
>>
>>>>>D('3.104') + D('0.000') + D('2.104')
>>
>>Decimal("5.20")
> 
> the results differ here because D(whatever)
> ignores context settings; having a common operation ignore context is
> ugly and error-prone).

I don't see that it's because of that. Even if D(whatever)
didn't ignore the context settings, you'd get the same
oddity if the numbers came from somewhere else with a
different precision.

I'm very uncomfortable about the whole idea of a
context-dependent precision. It just seems to be
asking for trouble.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From tim.peters at gmail.com  Fri May 20 06:42:07 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Fri, 20 May 2005 00:42:07 -0400
Subject: [Python-Dev] [Python-checkins] python/nondist/peps pep-0343.txt,
	1.11, 1.12
In-Reply-To: <428D6764.8060101@canterbury.ac.nz>
References: <E1DYALw-0001Rd-RB@sc8-pr-cvs1.sourceforge.net>
	<000101c55b5c$56f14b20$ab29a044@oemcomputer>
	<1f7befae0505172128f1f9daa@mail.gmail.com>
	<428D6764.8060101@canterbury.ac.nz>
Message-ID: <1f7befae0505192142a075275@mail.gmail.com>

[Greg Ewing]
> I don't see it's because of that. Even if D(whatever)
> didn't ignore the context settings, you'd get the same
> oddity if the numbers came from somewhere else with a
> different precision.

Most users don't change context precision, and in that case there is
no operation defined in the standard that can _create_ a decimal "with
different precision".  Python's Decimal constructor, however, can
(Python's Decimal constructor performs an operation that's not in the
standard -- it's a Python-unique extension to the standard).
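
A sketch of the distinction being drawn here: the Decimal constructor keeps
every digit (the Python-only extension), while Context.create_decimal() is
the standard's to-number conversion and rounds to context precision.

```python
from decimal import Decimal, getcontext

ctx = getcontext()
ctx.prec = 3
print(Decimal("3.104"))             # 3.104 -- context ignored, all digits kept
print(ctx.create_decimal("3.104"))  # 3.10  -- rounded to context precision
```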

> I'm very uncomfortable about the whole idea of a
> context-dependent precision. It just seems to be
> asking for trouble.

If you're running on a Pentium box, you're using context-dependent
precision a few million times per second.  Most users will be as
blissfully unaware of decimal's context precision as you are of the
Pentium FPU's context precision.  Most features in fp standards are
there for the benefit of experts.  You're not required to change
context; those who need such features need them desperately, and don't
care whether you think they should <wink>.

An alternative is a God-awful API that passes a context object
explicitly to every operation.  You can, e.g., kiss infix "+" goodbye
then.  Some implementations of the standard do exactly that.

You might want to read the standard before getting carried off by gut reactions:

    http://www2.hursley.ibm.com/decimal/

From gvanrossum at gmail.com  Fri May 20 06:53:46 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Thu, 19 May 2005 21:53:46 -0700
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <1f7befae050519205524f7a087@mail.gmail.com>
References: <e04bdf3105051906532a3b685d@mail.gmail.com>
	<000901c55ce3$819849e0$2db0958d@oemcomputer>
	<1f7befae050519205524f7a087@mail.gmail.com>
Message-ID: <ca471dc2050519215323e2c315@mail.gmail.com>

I know I should stay out of here, but isn't Decimal() with a string
literal as argument a rare case (except in examples)? It's like
float() with a string argument -- while you *can* write float("1.01"),
nobody does that. What people do all the time is parse a number out of
some larger context into a string, and then convert the string to a
float by passing it to float(). I assume that most uses of the
Decimal()  constructor will be similar. In that case, it  makes total
sense to me that the context's precision should be used, and if the
parsed string contains an insane number of digits, it will be rounded.

I guess the counter-argument is that because we don't have Decimal
literals, Decimal("12345") is used as a pseudo-literal, so it actually
occurs more frequently than float("12345"). Sure. But the same
argument applies: if I write a floating point literal in Python (or C,
or Java, or any other language) with an insane number of digits, it
will be rounded.

So, together with the 28-digit default precision, I'm fine with
changing the constructor to use the context by default. If you want
all the precision given in the string, even if it's a million digits,
set the precision to the length of the string before you start; that's
a decent upper bound. :-)
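
The workaround above, sketched: the string's length is a safe upper bound on
its number of significant digits, so widening precision to len(s) first
guarantees nothing is lost even when rounding to context is applied.

```python
from decimal import Decimal, getcontext

s = "3.14159265358979323846264338327950288"
getcontext().prec = len(s)  # upper bound: counts every char, digits or not
assert str(+Decimal(s)) == s  # rounding to this context keeps all digits
```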

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tim.peters at gmail.com  Fri May 20 07:15:02 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Fri, 20 May 2005 01:15:02 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <ca471dc2050519215323e2c315@mail.gmail.com>
References: <e04bdf3105051906532a3b685d@mail.gmail.com>
	<000901c55ce3$819849e0$2db0958d@oemcomputer>
	<1f7befae050519205524f7a087@mail.gmail.com>
	<ca471dc2050519215323e2c315@mail.gmail.com>
Message-ID: <1f7befae050519221531abbb54@mail.gmail.com>

[Guido van Rossum]
> I know I should stay out of here,

Hey, it's still your language <wink>.

> but isn't Decimal() with a string literal as argument a rare
> case (except in examples)?  It's like float() with a string
> argument -- while you *can* write float("1.01"), nobody does
> that. What people do all the time is parse a number out
> of some larger context into a string, and then convert the
> string to a float by passing it to float(). I assume that most
> uses of the Decimal()  constructor will be similar.

I think that's right.  For example, currency exchange rates, and stock
prices, are generally transmitted as decimal strings now, and those
will get fed to a Decimal constructor.

OTOH, in scientific computing it's common to specify literals to very
high precision (like 40 decimal digits).  Things like pi, e, sqrt(2),
tables of canned numeric quadrature points, canned coefficients for
polynomial approximations of special functions, etc.  The authors
don't expect "to get" all they specify, what they expect is that
various compilers on various platforms will give them as much
precision as they're capable of using efficiently.  Rounding is
expected then, and indeed pragmatically necessary (carrying precision
beyond that natively supported comes with high runtime costs -- and
that can be equally true of Decimal literals carried with digits
beyond context precision:  the standard requires that results be
computed "as if to infinite precision then rounded once" using _all_
digits in the inputs).

> In that case, it  makes total sense to me that the context's
> precision should be used, and if the parsed string contains
> an insane number of digits, it will be rounded.

That's the IBM standard's intent (and mandatory in its string->float operation).

> I guess the counter-argument is that because we don't have
> Decimal literals, Decimal("12345") is used as a pseudo-literal,
> so it actually occurs more frequently than float("12345"). Sure.
> But the same argument applies: if I write a floating point literal
> in Python (or C, or Java, or any other language) with an insane
> number of digits, it will be rounded.

Or segfault <0.9 wink>.

> So, together with the 28-digit default precision, I'm fine with
> changing the constructor to use the context by default. If you
> want all the precision given in the string, even if it's a million
> digits, set the precision to the length of the string before you
> start; that's a decent upper bound. :-)

That is indeed the intended way to do it.  Note that this also applies
to integers passed to a Decimal constructor.

Maybe it's time to talk about an unbounded rational type again <ducks>.

From python at rcn.com  Fri May 20 08:21:15 2005
From: python at rcn.com (Raymond Hettinger)
Date: Fri, 20 May 2005 02:21:15 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <1f7befae050519205524f7a087@mail.gmail.com>
Message-ID: <000a01c55d04$20eecb20$2db0958d@oemcomputer>

I sense a religious fervor about this so go ahead and do whatever you
want.

Please register my -1 for the following reasons:

a.) It re-introduces representation error into a module that worked so
hard to overcome that very problem.  The PEP explicitly promises that a
transformation from a literal involves no loss of information.
Likewise, it promises that "context just affects operations' results".

b.) It is inconsistent with the idea of having the input specify its own
precision:  http://www2.hursley.ibm.com/decimal/decifaq1.html#tzeros

c.) It is both untimely and unnecessary.  The module is functioning
according to its tests, the specification test suite, and the PEP.
Anthony should put his foot down as this is NOT a bugfix, it is a change
in concept.  The Context.create_decimal() method already provides a
standard conforming implementation of the to-number conversion.
http://www.python.org/peps/pep-0327.html#creating-from-context.

d.) I believe it will create more problems than it would solve.  If
needed, I can waste an afternoon coming up with examples.  Likewise, I
think it will make the module more difficult to use (esp. when
experimenting with the effect of results of changing precision).

e.) It does not eliminate the need to use the plus operation to force
rounding/truncation when switching precision.

f.) To be consistent, one would need to force all operation inputs to
have the context applied before their use.  The standard specifically
does not do this and allows for operation inputs to be of a different
precision than the current context (that is the reason for the plus
operation).

g.) It steers people in the wrong direction.  Increasing precision is
generally preferable to rounding or truncating explicit inputs.  I
included two Knuth examples in the docs to show the benefits of bumping
up precision when needed. 

h.) It complicates the heck out of storage, retrieval, and input.
Currently, decimal objects have a meaning independent of context.  With
the proposed change, the meaning becomes context dependent.

i.) After having been explicitly promised by the PEP, discussed on the
newsgroup and python-dev, and released to the public, a change of this
magnitude warrants a newsgroup announcement and a comment period.



A use case:
-----------
The first use case that comes to mind is in the math.toRadians()
function.  When originally posted, there was an objection that the
constant degToRad was imprecise to the last bit because it was expressed
as the ratio of two literals that the compiler would have rounded, resulting
in a double rounding.

Link to rationale for the spec:
-------------------------------
http://www2.hursley.ibm.com/decimal/IEEE-cowlishaw-arith16.pdf
See the intro to section 4 which says:  The digits in decimal are not
significands; rather, the numbers are exact.  The arithmetic on those
numbers is also exact unless rounding to a given precision is specified.

Link to the discussion relating decimal design rationale to schoolbook
math
------------------------------------------------------------------------
---
I can't find this link.  If someone remembers, please post it.



Okay, I've said my piece.
Do what you will.



Raymond

From python at rcn.com  Fri May 20 09:39:16 2005
From: python at rcn.com (Raymond Hettinger)
Date: Fri, 20 May 2005 03:39:16 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <000a01c55d04$20eecb20$2db0958d@oemcomputer>
Message-ID: <000401c55d0f$06e9b7c0$2db0958d@oemcomputer>

Addenda:

j.) The same rules would need to apply to all forms of the Decimal
constructor, so Decimal(someint) would also need to truncate/round if it
has more than precision digits -- likewise with Decimal(fromtuple) and
Decimal(fromdecimal).  All are problematic.  Integer conversions are
expected to be exact but may not be after the change.  Conversion from
another decimal should be idempotent but implicit rounding/truncation
will break that.  The fromtuple/totuple round-trip can get broken.  You
generally specify a tuple when you know exactly what you want.  
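
The fromtuple/totuple round-trip at stake, sketched with today's behaviour:

```python
from decimal import Decimal

t = (0, (3, 1, 4), -2)         # (sign, digits, exponent) -> 3.14
d = Decimal(t)
assert tuple(d.as_tuple()) == t  # exact round-trip, independent of context
```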

k.) The biggest client of all these methods is the Decimal module
itself.  Throughout the implementation, the code calls the Decimal
constructor to create intermediate values.  Every one of those calls
would need to be changed to specify a context.  Some of those cases are
not trivially changed (for instance, the hash method doesn't have a
context but it needs to check to see if a decimal value is exactly an
integer so it can hash to that value).  Likewise, how do you use a
decimal value for a dictionary key when the equality check is context
dependent (change precision and lose the ability to reference an entry)?


Be careful with this proposed change.  It is a can of worms.
Better yet, don't do it.  We already have a context aware
constructor method if that is what you really want.



Raymond

From python-dev at zesty.ca  Fri May 20 10:24:21 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Fri, 20 May 2005 03:24:21 -0500 (CDT)
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <ca471dc20505161611497a25b7@mail.gmail.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<ca471dc20505161611497a25b7@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505191948040.4932@server1.LFW.org>

On Mon, 16 May 2005, Guido van Rossum wrote:
> Here's a bunch of commentary:

Thanks.  Sorry it's taken me a couple of days to get back to this.
I think i'm caught up on the mail now.

> You're not giving enough credit to Java, which has the "cause" part
> nailed IMO.

You're right.  I missed that.

In my initial research i was only looking for implicit chaining, and
the only support for that i could find was the proposed @@ in Perl 6.
Later i went back and added explicit chaining, realizing that this
was what some of the interested parties originally wanted (and that
C# had it too).

> In particular, please read and understand the
> comment in ceval.c starting with this line:
>
>     /* Implementation notes for set_exc_info() and reset_exc_info():

Got it.

> There's too much text devoted early on to examples.

Okay.  In the next revision of the PEP, i'll rearrange it.

> I don't think the PEP adequately predicts what should happen in this
> example:
>
>     def foo():
> 	try:
> 	    1/0  # raises ZeroDivisionError
> 	except:
> 	    bar()
> 	    raise sys.does_not_exist  # raises AttributeError
>
>     def bar():
> 	try:
> 	    1+""  # raises TypeError
> 	except TypeError:
> 	    pass
>
> Intuitively, the AttributeError should have the ZeroDivisionError as
> its __context__, but I think the clearing of the thread's exception
> context when the except clause in bar() is left will drop the
> exception context.

That's true.  I agree that the semantics in the PEP (v1.7) are broken.

> Also, in that same example, according to your specs, the TypeError
> raised by bar() has the ZeroDivisionError raised in foo() as its
> context.  Do we really want this?

I don't think it's absolutely necessary, though it doesn't seem to
hurt.  We agree that if the TypeError makes it up to foo's frame,
it should have the ZeroDivisionError as its __context__, right?

If so, do i understand correctly that you want the __context__ to
depend on where the exception is caught as well as where it is raised?

In your thinking, is this mainly a performance or a cleanliness issue?

Basically i was looking for the simplest description that would
guarantee ending up with all the relevant tracebacks reported in
chronological order.  I thought it would be more complicated if we
had to keep modifying the traceback on the way up, but now that
i've re-learned how tracebacks are constructed, it's moot -- we're
already extending the traceback on the way through each frame.

I have a proposal for the implicit chaining semantics that i'll post
in a separate message so it isn't buried in the middle of this one.

> When chaining exceptions, I think it should be an error if the cause
> is not an exception instance (or None).

That's reasonable.

> Do we really need new syntax to set __cause__?  Java does this without
> syntax by having a standard API initCause() (as well as constructors
> taking a cause as argument; I understand why you don't want to rely on
> that -- neither does Java).  That seems more general because it can be
> used outside the context of a raise statement.

I went back and forth on this.  An earlier version of the PEP actually
proposes a 'setcause' method.  I eventually settled on a few reasons
for the "raise ... from" syntax:

    1.  (major) No possibility of method override; no susceptibility
        to manipulation of __dict__ or __getattr__; no possibility of
        another exception happening while trying to set the cause.

    2.  (moderate) There is a clear, distinct idiom for exception
        replacement requiring that the cause and effect must be
        identified together at the point of raising.

    3.  (minor) No method namespace pollution.

    4.  (minor) Less typing, less punctuation.

The main thing is that handling exceptions is a delicate matter, so it's
nice to have guarantees that the things you're doing aren't going to
suddenly raise more exceptions.
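
The proposed idiom, sketched using the syntax from the PEP (the names
ConfigError and load here are purely illustrative, not from the discussion):

```python
class ConfigError(Exception):
    pass

def load(settings, key):
    try:
        return settings[key]
    except KeyError as err:
        # cause and effect identified together, at the point of raising
        raise ConfigError("missing setting: " + key) from err

try:
    load({}, "timeout")
except ConfigError as e:
    assert isinstance(e.__cause__, KeyError)  # explicit chain recorded
```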

> Why insert a blank line between chained tracebacks?

Just to make them easier to read.  The last line of each traceback
is important because it identifies the exception type, and that will
be a lot easier to find if it isn't buried in an uninterrupted stream
of lines.

> I might want to add an extra line at the very end (and
> perhaps at each chaining point) warning the user that the exception
> has a chained counterpart that was printed earlier.

How about if the line says how many exceptions there were?  Like:

    [This is the last of 5 exceptions; see above for the others.]

> Why should the C level APIs not automatically set __context__?  (There
> may be an obvious reason but it doesn't hurt stating it.)

Because:

    (a) you indicated some discomfort with the idea, perhaps because
        it would make the interpreter do unnecessary work;
    (b) no one seems to be asking for it;
    (c) it seems potentially complicated.

However, if we go for the semantics you want, PyErr_Set* wouldn't set
__context__ at the moment of raising anyway.  If __context__ is set
during unwinding, then i expect it would get set on exceptions raised
from C too, since the interpreter wouldn't know the difference.

> I was surprised to learn that yield clears the exception state; I
> wonder if this isn't a bug in the generator implementation?  IMO
> better semantics would be for the exception state to survive across
> yield.

I agree -- i just didn't want to tackle that issue in this PEP.
It could be considered a separate enhancement/bugfix.

> You should at least mention what should happen to string exceptions,
> even if (as I presume) the only sane approach is not to support this
> for string exceptions (a string exception may be the end of the chain,
> but it cannot have a chained exception itself).

Yes, exactly.

> I don't like the example (in "Open Issues") of applications wrapping
> arbitrary exceptions in ApplicationError.  I consider this bad style,
> even if the chaining makes it not quite as bad as it used to be.

Isn't it a reasonable possibility that, as part of its contract, a
library will want to guarantee that it only raises exceptions of
certain types?

> I don't see the need for "except *, exc" -- assuming all exceptions
> derive from a single base class, we can just write the name of that
> base class.

If we get there, yes.  But at the moment, i don't believe there's any
way to catch an arbitrary string exception or an exception of a
non-Exception instance other than "except:".

> I don't like having sys.exception; instead, the only way to access the
> "current" exception ought to be to use an except clause with a
> variable.  (sys.last_exception is fine.)

That would be nice, again once we have all exceptions derive from
Exception.  It seems to me we'd have to do these changes in this order:

    (a) ban string exceptions
    (b) require all exceptions to derive from Exception
    (c) ban bare "except:"
    (d) eliminate sys.exc_*

Or do them all at once in Python 3000.  (Well, i guess all that is just
repeating what Brett has talked about putting in his exception PEP.)

> I like the idea of taking all APIs that currently require a (type,
> value, traceback) triple to *also* accept a single exception instance.

Okay.

> You should probably reference the proposal (pending a PEP; I think
> Brett is working on it?) that all exceptions should eventually derive
> from a common base class (a la Throwable in Java).

Okay.

> I hope that this can be accepted together with the son-of-PEP-343 (PEP
> 343 plus generator exception injection and finalization) so __exit__
> can take a single exception argument from the start.  (But what should
> it receive if a string exception is being caught?  A triple perhaps?)

Dare i suggest... a string subclass with a __traceback__ attribute?

A string subclass (that also subclasses Exception) might be a migration
path to eliminating string exceptions.


-- ?!ng

From python-dev at zesty.ca  Fri May 20 10:30:17 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Fri, 20 May 2005 03:30:17 -0500 (CDT)
Subject: [Python-Dev] PEP 344: Implicit Chaining Semantics
In-Reply-To: <ca471dc20505171102138355d8@mail.gmail.com>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<ca471dc20505161611497a25b7@mail.gmail.com>
	<ca471dc20505170741204b8dfe@mail.gmail.com>
	<94cc3abfd1b8fd2a83e6fa84b9452508@xs4all.nl>
	<ca471dc20505171102138355d8@mail.gmail.com>
Message-ID: <Pine.LNX.4.58.0505200324440.4932@server1.LFW.org>

Guido van Rossum wrote:
>     try:
>         BLOCK
>     except EXCEPTION, VAR:
>         HANDLER
>
> I'd like to see this translated into
>
>     try:
>         BLOCK
>     except EXCEPTION, VAR:
>         __context = VAR
>         try:
>             HANDLER
>         except Exception, __error:
>             __error.__context__ = __context
>             raise

Eric Nieuwland wrote:
> If I interpret the above translation correctly, then:
>      try:
>          BLOCK1
>      except EXCEPTION1, VAR1:
>          try:
>              BLOCK2
>          except EXCEPTION2, VAR2:
>              HANDLER
>
> with exceptions occurring in BLOCK1, BLOCK2 and HANDLER would result in
> HANDLER's exception with __context__ set to BLOCK1's exception and
> BLOCK2's exception would be lost.

Guido van Rossum wrote:
> But that could easily be fixed by appending the context to the end of
> the chain, right?

That would fix this case, but i have a hard time proving to myself that
the result would include all the tracebacks in chronological order.

Can we back up and see if we can agree on a specification at a semantic
level first?  I've been trying to narrow down exactly what you seem to
intuitively want -- how do you like the following:

   Definition of "context": An exception-raise event X is the context
   for an exception-raise event Y if and only if

   1.  Y occurs after X in the same thread.

   2.  If an 'except' clause catches X, Y occurs before exit from this
       clause and is not caught before exit from this clause.

(I refer to "exception-raise events" to avoid any ambiguity about the
same exception object being raised twice.  Each raise-event corresponds
to at most one catch.)
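
The behaviour this definition is trying to pin down, sketched with the
__context__ attribute from the PEP: an exception raised while an earlier
one is being handled records the earlier one as its context.

```python
try:
    try:
        1 / 0               # raise-event X (ZeroDivisionError)
    except ZeroDivisionError:
        1 + ""              # raise-event Y (TypeError), inside X's handler
except TypeError as e:
    assert isinstance(e.__context__, ZeroDivisionError)  # X is Y's context
```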


-- ?!ng

From python-dev at zesty.ca  Fri May 20 10:31:02 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Fri, 20 May 2005 03:31:02 -0500 (CDT)
Subject: [Python-Dev] PEP 344: Explicit vs. Implicit Chaining
Message-ID: <Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>

Guido van Rossum wrote:
> Do we really need both __context__ and __cause__?

Well, it depends whose needs we're trying to meet.

If we want to satisfy those who have been asking for chaining
of unexpected secondary exceptions, then we have to provide that
on some attribute.

If we also want to provide the facility that Java and C# provide
with initCause/InnerException, then we need a separate attribute
devoted to explicit chaining.  The Java and C# documentation is
clear that the cause/inner exception is to be set only on an
exception that is "caused" or a "direct result" of the primary
exception, which i've taken as a sign that this is an important
distinction.

I wanted to give a shot at making both camps happy.

If the two were unified, we'd still be better off than we are
now, but we should be aware that we would not be providing the
functionality that Java and C# provide.


-- ?!ng

From python-dev at zesty.ca  Fri May 20 10:42:32 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Fri, 20 May 2005 03:42:32 -0500 (CDT)
Subject: [Python-Dev] PEP 344: Implicit Chaining Semantics
In-Reply-To: <Pine.LNX.4.58.0505200324440.4932@server1.LFW.org>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<ca471dc20505161611497a25b7@mail.gmail.com>
	<ca471dc20505170741204b8dfe@mail.gmail.com>
	<94cc3abfd1b8fd2a83e6fa84b9452508@xs4all.nl>
	<ca471dc20505171102138355d8@mail.gmail.com>
	<Pine.LNX.4.58.0505200324440.4932@server1.LFW.org>
Message-ID: <Pine.LNX.4.58.0505200332460.4932@server1.LFW.org>

On Fri, 20 May 2005, Ka-Ping Yee wrote:
> Can we back up and see if we can agree on a specification at a semantic
> level first?  I've been trying to narrow down exactly what you seem to
> intuitively want -- how do you like the following:
>
>    Definition of "context": An exception-raise event X is the context
>    for an exception-raise event Y if and only if
>
>    1.  Y occurs after X in the same thread.
>
>    2.  If an 'except' clause catches X, Y occurs before exit from this
>        clause and is not caught before exit from this clause.

Dang.  I forgot to deal with what 'exit' means in the face of 'yield'.

   2.  If an 'except' clause catches X, Y occurs before "exit" from or
       while "inside" this clause, and is not caught before "exit" from
       or while "inside" this clause.

       "Enter" means entering at the beginning or resuming after yield.
       "Exit" means execution of return or yield, jumping out of a block
       via break or continue, or reaching the end of the block.

       "Inside" means entered and not yet exited.


-- ?!ng

From ncoghlan at gmail.com  Fri May 20 11:22:20 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 20 May 2005 19:22:20 +1000
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <000401c55d0f$06e9b7c0$2db0958d@oemcomputer>
References: <000401c55d0f$06e9b7c0$2db0958d@oemcomputer>
Message-ID: <428DAC4C.3010505@gmail.com>

Raymond Hettinger wrote:
> Be careful with this proposed change.  It is a can of worms.
> Better yet, don't do it.  We already have a context aware
> constructor method if that is what you really want.

And don't forgot that 'context-aware-construction' can also be written:

   val = +Decimal(string_repr)

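A sketch of that spelling: the constructor keeps all the digits, and the
unary plus then applies the current context's precision.

```python
from decimal import Decimal, getcontext

getcontext().prec = 3
exact = Decimal("3.104")   # Decimal("3.104") -- untouched by context
val = +exact               # Decimal("3.10")  -- rounded on the plus
```
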
Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Fri May 20 11:39:51 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 20 May 2005 19:39:51 +1000
Subject: [Python-Dev] python-dev Summary for 2005-05-01 through
 2005-05-15 [draft]
In-Reply-To: <9613db60050519061018ae0b8@mail.gmail.com>
References: <9613db60050519061018ae0b8@mail.gmail.com>
Message-ID: <428DB067.8040603@gmail.com>

Tim Lesher wrote:
> Here's the first draft of the python-dev summary for the first half of
> May.  Please send any corrections or suggestions to the summarizers (CC'ed).

Nice work again, folks.

This summary has convinced me that the right thing to do with PEP 3XX is to fix 
the formatting, submit it again and immediately withdraw it. I think that's 
worth doing due to the PEP draft's role in the PEP 340 discussion, but it looks 
like the worthwhile ideas it contains are going to make their way into other 
PEP's (like 343 and the updated/new generator API PEP), so I don't see much 
value in keeping PEP 3XX alive as a competitor to PEP 343.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From walter at livinglogic.de  Fri May 20 11:46:30 2005
From: walter at livinglogic.de (=?ISO-8859-1?Q?Walter_D=F6rwald?=)
Date: Fri, 20 May 2005 11:46:30 +0200
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <Pine.LNX.4.58.0505191948040.4932@server1.LFW.org>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>	<ca471dc20505161611497a25b7@mail.gmail.com>
	<Pine.LNX.4.58.0505191948040.4932@server1.LFW.org>
Message-ID: <428DB1F6.1060602@livinglogic.de>

Ka-Ping Yee wrote:

> [...]
>     (a) ban string exceptions
>     (b) require all exceptions to derive from Exception
>     (c) ban bare "except:"
>     (d) eliminate sys.exc_*

I think somewhere in this list should be:

       (?) Remove string exceptions from the Python stdlib

and perhaps:

       (?) Make Exception a new style class

Bye,
    Walter Dörwald

From mwh at python.net  Fri May 20 14:05:13 2005
From: mwh at python.net (Michael Hudson)
Date: Fri, 20 May 2005 13:05:13 +0100
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <428DB1F6.1060602@livinglogic.de> (
	=?iso-8859-1?q?Walter_D=F6rwald's_message_of?= "Fri,
	20 May 2005 11:46:30 +0200")
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<ca471dc20505161611497a25b7@mail.gmail.com>
	<Pine.LNX.4.58.0505191948040.4932@server1.LFW.org>
	<428DB1F6.1060602@livinglogic.de>
Message-ID: <2m3bsi2jd2.fsf@starship.python.net>

Walter Dörwald <walter at livinglogic.de> writes:

> Ka-Ping Yee wrote:
>
>> [...]
>>     (a) ban string exceptions
>>     (b) require all exceptions to derive from Exception
>>     (c) ban bare "except:"
>>     (d) eliminate sys.exc_*
>
> I think somewhere in this list should be:
>
>        (?) Remove string exceptions from the Python stdlib

I think this is done, more or less.  There's one in test_descr, I
think (probably testing that you can't raise str-subclasses but can
raise strs).

> and perhaps:
>
>        (?) Make Exception a new style class

I have a patch for this on SF, of course.

Cheers,
mwh

-- 
112. Computer Science is embarrassed by the computer.
  -- Alan Perlis, http://www.cs.yale.edu/homes/perlis-alan/quotes.html

From mcherm at mcherm.com  Fri May 20 14:24:53 2005
From: mcherm at mcherm.com (Michael Chermside)
Date: Fri, 20 May 2005 05:24:53 -0700
Subject: [Python-Dev] Adventures with Decimal
Message-ID: <20050520052453.7dc5fr5q27ggwgg4@login.werra.lunarpages.com>

[Tim and Raymond are slugging it out about whether Decimal constructors
 should respect context precision]

Tim, I find Raymond's arguments to be much more persuasive. (And that's
even BEFORE I read his 11-point missive.) I understood the concept that
*operations* are context-dependent, but decimal *objects* are not, and
thus it made sense to me that *constructors* were not context-dependent.

On the other hand, I am NOT a floating-point expert. Can you educate
me some? What is an example of a case where users would get "wrong"
results because constructors failed to respect context precision?

(By the way... even if other constructors begin to respect context
precision, the constructor from tuple should NOT -- it exists to provide
low-level access to the implementation. I'll express no opinion on the
constructor from Decimal, because I don't understand the issues.)
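The distinction under discussion can be seen directly in the decimal module's
current behavior: the string constructor keeps every digit, while operations
(even unary plus) round to context precision. A minimal sketch:

```python
from decimal import Decimal, getcontext

getcontext().prec = 4          # four significant digits for operations

d = Decimal("1.234567")        # constructor is context-free: all digits kept
assert str(d) == "1.234567"

assert str(+d) == "1.235"      # unary plus is an operation, so it rounds
assert str(d + 0) == "1.235"   # arithmetic likewise respects the context
```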

-- Michael Chermside


From walter at livinglogic.de  Fri May 20 17:06:07 2005
From: walter at livinglogic.de (=?ISO-8859-1?Q?Walter_D=F6rwald?=)
Date: Fri, 20 May 2005 17:06:07 +0200
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <2m3bsi2jd2.fsf@starship.python.net>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>	<ca471dc20505161611497a25b7@mail.gmail.com>	<Pine.LNX.4.58.0505191948040.4932@server1.LFW.org>	<428DB1F6.1060602@livinglogic.de>
	<2m3bsi2jd2.fsf@starship.python.net>
Message-ID: <428DFCDF.40508@livinglogic.de>

Michael Hudson wrote:

> Walter D?rwald <walter at livinglogic.de> writes:
> 
>>Ka-Ping Yee wrote:
>>
>>>[...]
>>>    (a) ban string exceptions
>>>    (b) require all exceptions to derive from Exception
>>>    (c) ban bare "except:"
>>>    (d) eliminate sys.exc_*
>>
>>I think somewhere in this list should be:
>>
>>       (?) Remove string exceptions from the Python stdlib
> 
> I think this is done, more or less.  There's one in test_descr, I
> think (probably testing that you can't raise str-subclasses but can
> raise strs).

There are a few appearances in docstrings, and the Demo, Mac/Tools and 
Tools directories.

Those should be rather simple to fix. The only problem might be if there 
is code that catches these exceptions.

>>and perhaps:
>>
>>       (?) Make Exception a new style class
> 
> I have a patch for this on SF, of course.

http://www.python.org/sf/1104669 it seems.

Bye,
    Walter Dörwald

From tim.peters at gmail.com  Fri May 20 18:19:41 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Fri, 20 May 2005 12:19:41 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <20050520052453.7dc5fr5q27ggwgg4@login.werra.lunarpages.com>
References: <20050520052453.7dc5fr5q27ggwgg4@login.werra.lunarpages.com>
Message-ID: <1f7befae050520091978d2a891@mail.gmail.com>

[Michael Chermside]
> Tim, I find Raymond's arguments to be much more persuasive.
> (And that's even BEFORE I read his 11-point missive.) I
> understood the concept that *operations* are context-
> dependent, but decimal *objects* are not, and thus it made
> sense to me that *constructors* were not context-dependent.
>
> On the other hand, I am NOT a floating-point expert. Can you
> educate me some?

Sorry, I can't make more time for this now.  The short course is that
a module purporting to implement an external standard should not
deviate from that standard without very good reasons, and should make
an effort to "hide" whatever deviations it thinks it needs to indulge
(e.g., make them harder to spell).  This standard provides 100%
portable (across HW, across OSes, across programming languages)
decimal arithmetic, but of course that's only across
standard-conforming implementations.

That the decimal constructor here deviates from the standard appears
to be just an historical accident (despite Raymond's current
indefatigable rationalizations <wink>).  Other important
implementations of the standard didn't make this mistake; for example,
Java's BigDecimal(java.lang.String) constructor follows the rules
here:

    http://www2.hursley.ibm.com/decimalj/deccons.html

Does it really need to be argued interminably that deviating from a
standard is a Big Deal?  Users pay for that eventually, not
implementors.  Even if a standard "is wrong" (and leaving aside that I
believe this standard asks for the right behavior here), users benefit
from cross-implementation predictability a lot more than they can
benefit from a specific implementation's non-standard idiosyncrasies.

From gvanrossum at gmail.com  Fri May 20 18:37:48 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 20 May 2005 09:37:48 -0700
Subject: [Python-Dev] PEP 344: Explicit vs. Implicit Chaining
In-Reply-To: <Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
References: <Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
Message-ID: <ca471dc205052009371046ad2b@mail.gmail.com>

[Guido van Rossum]
> > Do we really need both __context__ and __cause__?

[Ka-Ping Yee]
> Well, it depends whose needs we're trying to meet.
> 
> If we want to satisfy those who have been asking for chaining
> of unexpected secondary exceptions, then we have to provide that
> on some attribute.
> 
> If we also want to provide the facility that Java and C# provide
> with initCause/InnerException, then we need a separate attribute
> devoted to explicit chaining.  The Java and C# documentation is
> clear that the cause/inner exception is to be set only on an
> exception that is "caused" or a "direct result" of the primary
> exception, which i've taken as a sign that this is an important
> distinction.
> 
> I wanted to give a shot at making both camps happy.
> 
> If the two were unified, we'd still be better off than we are
> now, but we should be aware that we would not be providing the
> functionality that Java and C# provide.

But what difference does it make in practice? In first approximation,
the only time the context is interesting is when a traceback is
printed. Since you propose to print __context__ when __cause__ isn't
set, from the POV of the user reading the traceback the effect is the
same as if there was only one link (let's call it __cause__) and the
APIs for setting it simply override the default.
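For reference, the semantics that eventually shipped in Python 3 (via PEP
3134, this PEP's successor) kept both attributes, along the lines Ping
proposed: implicit chaining records `__context__`, while `raise ... from`
sets `__cause__` explicitly:

```python
# Implicit chaining: raising inside an except block records __context__.
try:
    try:
        1 / 0
    except ZeroDivisionError:
        raise KeyError("secondary failure")
except KeyError as err:
    assert isinstance(err.__context__, ZeroDivisionError)
    assert err.__cause__ is None          # no explicit cause was given

# Explicit chaining: "raise ... from" sets __cause__ (and __context__ too).
try:
    try:
        1 / 0
    except ZeroDivisionError as exc:
        raise KeyError("secondary failure") from exc
except KeyError as err:
    assert isinstance(err.__cause__, ZeroDivisionError)
```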

(PS I'm still thinking about the equivalence of the chaining
algorithms; I've got a proof sketch in my head but getting the details
in email is taking time.)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tim.peters at gmail.com  Fri May 20 18:46:12 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Fri, 20 May 2005 12:46:12 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <1f7befae050520091978d2a891@mail.gmail.com>
References: <20050520052453.7dc5fr5q27ggwgg4@login.werra.lunarpages.com>
	<1f7befae050520091978d2a891@mail.gmail.com>
Message-ID: <1f7befae0505200946559e4b84@mail.gmail.com>

[Tim Peters]
> ...
> Other important implementations of the standard didn't
> make this mistake; for example, Java's BigDecimal
> (java.lang.String) constructor follows the rules here:
>
>    http://www2.hursley.ibm.com/decimalj/deccons.html

Hmm -- or maybe it doesn't!  The text says:

    The BigDecimal constructed from the String is in a
    standard form, as though the add method had been
    used to add zero to the number with unlimited
    precision.[1]

and I read "add zero" as "applies context".  But then it says
"unlimited precision".  I'm not at all sure what it means now.

From gvanrossum at gmail.com  Fri May 20 19:06:09 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 20 May 2005 10:06:09 -0700
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <1f7befae0505200946559e4b84@mail.gmail.com>
References: <20050520052453.7dc5fr5q27ggwgg4@login.werra.lunarpages.com>
	<1f7befae050520091978d2a891@mail.gmail.com>
	<1f7befae0505200946559e4b84@mail.gmail.com>
Message-ID: <ca471dc205052010062783f7fd@mail.gmail.com>

Maybe they just meant it as an explanation of "standard form",
clarifying that -0 is turned into +0? (That's what adding 0 does,
right?)

On 5/20/05, Tim Peters <tim.peters at gmail.com> wrote:
> [Tim Peters]
> > ...
> > Other important implementations of the standard didn't
> > make this mistake; for example, Java's BigDecimal
> > (java.lang.String) constructor follows the rules here:
> >
> >    http://www2.hursley.ibm.com/decimalj/deccons.html
> 
> Hmm -- or maybe it doesn't!  The text says:
> 
>     The BigDecimal constructed from the String is in a
>     standard form, as though the add method had been
>     used to add zero to the number with unlimited
>     precision.[1]
> 
> and I read "add zero" as "applies context".  But then it says
> "unlimited precision".  I'm not at all sure what it means now.


-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From mcherm at mcherm.com  Fri May 20 19:34:32 2005
From: mcherm at mcherm.com (Michael Chermside)
Date: Fri, 20 May 2005 10:34:32 -0700
Subject: [Python-Dev] Adventures with Decimal
Message-ID: <20050520103432.um95cugr4zhcg44k@login.werra.lunarpages.com>

Tim Peters writes:
> Other important
> implementations of the standard didn't make this mistake; for example,
> Java's BigDecimal(java.lang.String) constructor follows the rules
> here:
           [...]
> Hmm -- or maybe it doesn't!  The text says:
           [...]


Here is the actual behavior:

  Jython 2.1 on java1.5.0_03 (JIT: null)
  Type "copyright", "credits" or "license" for more information.
  >>> import java.math.BigDecimal as BigDecimal
  >>> import java.math.MathContext as MathContext
  >>> BigDecimal("0.33333333333333333333333333333333333")
  0.33333333333333333333333333333333333
  >>> BigDecimal("0.33333333333333333333333333333333333", MathContext.DECIMAL32)
  0.3333333
  >>>

In other words, Java's behavior is much closer to the current behavior
of Python, at least in terms of features that are user-visible. The
default behavior in Java is to have infinite precision unless a context
is supplied that says otherwise. So the constructor that takes a string
converts it faithfully, while the constructor that takes a context
obeys the context.

One could argue that they are "following the rules" appropriately and it
just happens that their default context has infinite precision. But from
the point of view of the typical, non-floating-point-aware user, Java's
constructor gives "all the digits you told it to", and so does Python's
current string constructor. Using "+decimal.Decimal(s)" today "rounds off
the constant" (in the naive user's viewpoint).

Of course, the user is going to be surprised in the NEXT step since the
Python *operations* respect context while the Java ones use infinite
precision for +, -, and * (and require you to specify the behavior for /).

(PS: No, I don't think we should design decimal.Decimal to match the
behavior of Java... but I don't think that the Java example really helps
make your point.)

Elsewhere, Tim writes:
> Sorry, I can't make more time for this now.

I understand!

> The short course is that
> a module purporting to implement an external standard should not
> deviate from that standard without very good reasons

Yes, but should we think of the constructor-from-string as an
implementation-specific means of creating Decimal objects, which is
separate from string-to-Decimal converter that the standard requires
(and which is provided by the Context objects)?

-- Michael Chermside


From gvanrossum at gmail.com  Fri May 20 20:43:37 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 20 May 2005 11:43:37 -0700
Subject: [Python-Dev] PEP 344: Exception Chaining and Embedded Tracebacks
In-Reply-To: <Pine.LNX.4.58.0505191948040.4932@server1.LFW.org>
References: <Pine.LNX.4.58.0505161634540.14555@server1.LFW.org>
	<ca471dc20505161611497a25b7@mail.gmail.com>
	<Pine.LNX.4.58.0505191948040.4932@server1.LFW.org>
Message-ID: <ca471dc2050520114333de7a7f@mail.gmail.com>

[Guido]
> > Here's a bunch of commentary:

[Ping]
> Thanks.  Sorry it's taken me a couple of days to get back to this.
> I think i'm caught up on the mail now.

No problem!

<snip>

> > Also, in that same example, according to your specs, the TypeError
> > raised by bar() has the ZeroDivisionError raised in foo() as its
> > context.  Do we really want this?
> 
> I don't think it's absolutely necessary, though it doesn't seem to
> hurt.  We agree that if the TypeError makes it up to foo's frame,
> it should have the ZeroDivisionError as its __context__, right?

Yes.

> If so, do i understand correctly that you want the __context__ to
> depend on where the exception is caught as well as where it is raised?

I think so, but that's not how I think about it.  IMO the only time
when the context becomes *relevant* is when a finally/except clause is
left with a different exception than it was entered.

> In your thinking, is this mainly a performance or a cleanliness issue?

Hard to say; the two are often difficult to separate for me.  The
performance in this case bothers me because it means unnecessary churn
when exceptions are raised and caught.  I know I've said in the past I
don't care about the performance of exceptions, but that's not *quite*
true, given that they are quite frequently used for control flow
(e.g. StopIteration).  I don't know how to quantify the performance
effect though (unless it means that exceptions would have to be
*instantiated* sooner than currently; in those cases where exception
instantiation is put off, it is put off for one reason, and that
reason is performance!)

The cleanliness issue is also important to me: when I have some
isolated code that raises and successfully catches an exception, and
that code happens to be used by a logging operation that is invoked
from an exception handler, why would the context of the exception have
to include something that happened way deeper on the stack and that is
totally irrelevant to the code that catches *my* exception?

> Basically i was looking for the simplest description that would
> guarantee ending up with all the relevant tracebacks reported in
> chronological order.  I thought it would be more complicated if we
> had to keep modifying the traceback on the way up, but now that
> i've re-learned how tracebacks are constructed, it's moot -- we're
> already extending the traceback on the way through each frame.
> 
> I have a proposal for the implicit chaining semantics that i'll post
> in a separate message so it isn't buried in the middle of this one.

OK, looking forward to it.

> > Do we really need new syntax to set __cause__?  Java does this without
> > syntax by having a standard API initCause() (as well as constructors
> > taking a cause as argument; I understand why you don't want to rely on
> > that -- neither does Java).  That seems more general because it can be
> > used outside the context of a raise statement.
> 
> I went back and forth on this.  An earlier version of the PEP actually
> proposes a 'setcause' method.  I eventually settled on a few reasons
> for the "raise ... from" syntax:
> 
>     1.  (major) No possibility of method override; no susceptibility
>         to manipulation of __dict__ or __getattr__; no possibility of
>         another exception happening while trying to set the cause.

Hm; the inability to override the method actually sounds like a major
disadvantage.  I'm not sure what you mean by __dict__ or __getattr__
except other ways to tweak the attribute assignment; maybe you forgot
about descriptors?  Another exception could *still* happen and I think
the __context__ setting mechanism will take care of it just fine.

>     2.  (moderate) There is a clear, distinct idiom for exception
>         replacement requiring that the cause and effect must be
>         identified together at the point of raising.

Well, nothing stops me from adding a setCause() method to my own
exception class and using that instead of the from syntax, right?  I'm
not sure why it is so important to have a distinct idiom, and even if
we do, I think that a method call will do just fine:

  except EnvironmentError, err:
      raise MyApplicationError("boo hoo").setCause(err)

>     3.  (minor) No method namespace pollution.
> 
>     4.  (minor) Less typing, less punctuation.
> 
> The main thing is that handling exceptions is a delicate matter, so it's
> nice to have guarantees that the things you're doing aren't going to
> suddenly raise more exceptions.

I don't see that there's all that much that can go wrong in
setCause().  After all, it's the setCause() of the *new* exception
(which you can know and trust) that could cause trouble; a boobytrap
in the exception you just caught could not possibly be set off by
simply using it as a cause.  (It can cause the traceback printing to
fail, of course, but that's not a new issue.)
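A hypothetical `setCause()` in the style Guido sketches (not a real stdlib
API; written here against the `__cause__` attribute the PEP proposes, which
is how Python 3 eventually spelled it) might look like:

```python
class MyApplicationError(Exception):
    def setCause(self, cause):
        """Record the causing exception and return self for inline use."""
        self.__cause__ = cause
        return self

try:
    try:
        raise EnvironmentError("boom")
    except EnvironmentError as err:
        # The method-call idiom from the example above:
        raise MyApplicationError("boo hoo").setCause(err)
except MyApplicationError as app_err:
    assert isinstance(app_err.__cause__, EnvironmentError)
```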

> > Why insert a blank line between chained tracebacks?
> 
> Just to make them easier to read.  The last line of each traceback
> is important because it identifies the exception type, and that will
> be a lot easier to find if it isn't buried in an uninterrupted stream
> of lines.

Yeah, but those lines follow an easy-to-recognize pattern with
alternating "File" lines and source code lines, always indented 2
resp. 4 spaces; anything different is easily found.

> > I might want to add an extra line at the very end (and
> > perhaps at each chaining point) warning the user that the exception
> > has a chained counterpart that was printed earlier.
> 
> How about if the line says how many exceptions there were?  Like:
> 
>     [This is the last of 5 exceptions; see above for the others.]

Something like that would be very helpful indeed.

> > Why should the C level APIs not automatically set __context__?  (There
> > may be an obvious reason but it doesn't hurt stating it.)
> 
> Because:
> 
>     (a) you indicated some discomfort with the idea, perhaps because
>         it would make the interpreter do unnecessary work;
>     (b) no one seems to be asking for it;
>     (c) it seems potentially complicated.

None of these seem very good reasons if we were to go with your
original design. :-)

> However, if we go for the semantics you want, PyErr_Set* wouldn't set
> __context__ at the moment of raising anyway.  If __context__ is set
> during unwinding, then i expect it would get set on exceptions raised
> from C too, since the interpreter wouldn't know the difference.

Probably true.  It's worth looking at the implementation in detail (I
don't have it in my head and not enough time to look it up myself).

> > I was surprised to learn that yield clears the exception state; I
> > wonder if this isn't a bug in the generator implementation?  IMO
> > better semantics would be for the exception state to survive across
> > yield.
> 
> I agree -- i just didn't want to tackle that issue in this PEP.
> It could be considered a separate enhancement/bugfix.

Perhaps.  You might look into why this is -- I wonder if it isn't an
accident of the generator implementation (like ceval() might be
clearing the exception info upon entry even if it's resuming a frame
-- no idea if that's the case without doing more research than I have
time for).

> > I don't like the example (in "Open Issues") of applications wrapping
> > arbitrary exceptions in ApplicationError.  I consider this bad style,
> > even if the chaining makes it not quite as bad as it used to be.
> 
> Isn't it a reasonable possibility that, as part of its contract, a
> library will want to guarantee that it only raises exceptions of
> certain types?

Yeah, but if you apply that recursively, it should only have to
*catch* exceptions of certain types when it is calling other code.
E.g. it's fine to catch EnvironmentError (which is basically IOError +
os.error) when you're manipulating files; but it's not okay to catch
all exceptions and wrap them.  I've seen too many ApplicationErrors
that were either hiding bugs in the application, or trapping
KeyboardInterrupt and similar ones.  MemoryError, SystemExit and
SystemError also shouldn't be wrapped.

I don't think it's a good idea to try to guarantee "this application
only ever raises ApplicationError".  A better guarantee is "errors
that this application *detects* will always be reported as
ApplicationError."  IMO bugs in the application should *never* be
wrapped in ApplicationError.
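This guideline, sketched in code (the names are illustrative, not from any
real library): catch only the narrow errors the library expects and wrap
those, letting bugs and control-flow exceptions propagate untouched:

```python
class ApplicationError(Exception):
    """Raised for errors the application *detects*, never for its bugs."""

def read_config(path):
    try:
        with open(path) as f:
            return f.read()
    except EnvironmentError as err:   # only IOError/os.error, never bare except:
        raise ApplicationError("cannot read config %r: %s" % (path, err))

# A missing file is a detected error, so it surfaces as ApplicationError;
# a TypeError from a bad argument would propagate unchanged.
try:
    read_config("/nonexistent/definitely-missing.cfg")
except ApplicationError:
    pass
```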

> > I don't see the need for "except *, exc" -- assuming all exceptions
> > derive from a single base class, we can just write the name of that
> > base class.
> 
> If we get there, yes.  But at the moment, i don't believe there's any
> way to catch an arbitrary string exception or an exception of a
> non-Exception instance other than "except:".

Sure.  But since we all seem to be agreeing on eventually making all
exceptions derive from a common base class, we shouldn't be inventing
new syntax that will later become redundant.

> > I don't like having sys.exception; instead, the only way to access the
> > "current" exception ought to be to use an except clause with a
> > variable.  (sys.last_exception is fine.)
> 
> That would be nice, again once we have all exceptions derive from
> Exception.  It seems to me we'd have to do these changes in this order:
> 
>     (a) ban string exceptions
>     (b) require all exceptions to derive from Exception
>     (c) ban bare "except:"
>     (d) eliminate sys.exc_*
> 
> Or do them all at once in Python 3000.  (Well, i guess all that is just
> repeating what Brett has talked about putting in his exception PEP.)

I guess it's a separate topic.  I'd like your PEP to focus on the
ideal Python 3000 semantics first; once we agree on that we can
discuss how to get there from here.

> > I hope that this can be accepted together with the son-of-PEP-343 (PEP
> > 343 plus generator exception injection and finalization) so __exit__
> > can take a single exception argument from the start.  (But what should
> > it receive if a string exception is being caught?  A triple perhaps?)
> 
> Dare i suggest... a string subclass with a __traceback__ attribute?
> 
> A string subclass (that also subclasses Exception) might be a migration
> path to eliminating string exceptions.

Hardly.  With string exceptions, the *identity* of the string object
decides the identity of the exception caught (matching uses 'is' not
'==').

I'd rather just leave string exceptions alone and say certain features
don't work if you use them -- that'll be an encouragement for people
to rip them out.  If you want to do *anything* about them, you might
create a family of Exception subclasses, one subclass per string
object.  Implementing this would be a nice exercise using metaclasses.
Suppose you have a class factory named StringExceptionClass that takes
a string object and returns the appropriate StringException subclass
(maintaining a global dict mapped by object identity so it returns the
same class if the same object is passed).  Then

  raise "abc", XYZ

could be translated into

  raise StringExceptionClass("abc")(XYZ)

and

  except "abc", err:

could be equivalent to

  except StringExceptionClass("abc"), err:

The remaining incompatibility would be that err would hold a
StringException instance rather than just the value of XYZ.  Or the
VM's except handling code could pull the value out of the exception and
store it in err for full compatibility.
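A sketch of the StringExceptionClass factory described above (illustrative
only; using a plain dict rather than a metaclass): one Exception subclass
per string *object*, keyed by identity so that matching mirrors the 'is'
semantics of string exceptions:

```python
_by_identity = {}   # id(string) -> (string, class); keeps the strings alive

def StringExceptionClass(s):
    """Return the same Exception subclass for the same string object."""
    try:
        return _by_identity[id(s)][1]
    except KeyError:
        cls = type("StringException", (Exception,), {"string": s})
        _by_identity[id(s)] = (s, cls)
        return cls

s1 = "abc"
s2 = "".join(["ab", "c"])            # equal value, distinct object
assert s1 == s2 and s1 is not s2
assert StringExceptionClass(s1) is StringExceptionClass(s1)
# Distinct string objects get distinct classes, even when equal:
assert StringExceptionClass(s1) is not StringExceptionClass(s2)
```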

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From facundobatista at gmail.com  Fri May 20 22:47:56 2005
From: facundobatista at gmail.com (Facundo Batista)
Date: Fri, 20 May 2005 17:47:56 -0300
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <1f7befae050519205524f7a087@mail.gmail.com>
References: <e04bdf3105051906532a3b685d@mail.gmail.com>
	<000901c55ce3$819849e0$2db0958d@oemcomputer>
	<1f7befae050519205524f7a087@mail.gmail.com>
Message-ID: <e04bdf31050520134750c3b22d@mail.gmail.com>

On 5/20/05, Tim Peters <tim.peters at gmail.com> wrote:


> That's not my interest here.  My interest is in a consistent,

Point. Every time I explain Decimal, I have to say "always the context
is applied EXCEPT at construction time".


> > If someone wants to craft another module to wrap around and
> > candy-coat the Decimal API, I would be all for it.
> 
> For example, Facundo is doing that with a money class, yes?  That's

Yes, and so far it's pretty much "Hey! Let's take Decimal and define
how we configure and use it".


.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

From python at rcn.com  Fri May 20 22:52:22 2005
From: python at rcn.com (Raymond Hettinger)
Date: Fri, 20 May 2005 16:52:22 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <1f7befae050520091978d2a891@mail.gmail.com>
Message-ID: <000101c55d7d$d2967ca0$ec07a044@oemcomputer>

> Does it really need to be argued interminably that deviating from a
> standard is a Big Deal? 

The word deviate inaccurately suggests that we do not have a compliant
method which, of course, we do.  There are two methods, one context
aware and the other context free.  The proposal is to change the
behavior of the context free version, treat it as a bug, and alter it in
the middle of a major release.  The sole argument resembles bible
thumping.

Now for a tale.  Once upon a time, one typed the literal 1.1 but ended
up with the nearest representable value, 1.1000000000000001.  The
representation error monster terrorized the land and there was much
sadness. 

From the mists of Argentina, a Paladin set things right.  The literal
1.1 became representable and throughout the land the monster was
believed to have been slain.  With their guard down, no one thought
twice when a Zope sorcerer had the bright idea that long literals like
1.1000000000000001 should no longer be representable and should
implicitly jump to the nearest representable value, 1.1.  Thus the
monster arose like a Phoenix.  Because it was done in a bugfix release,
without a PEP, and no public comment, the citizens were caught
unprepared and faced an eternity dealing with the monster so valiantly
assailed by the Argentine.

Bible thumping notwithstanding, this change is both unnecessary and
undesirable.  Implicit rounding in the face of explicit user input to
the contrary is a bad idea.  Internally, the implementation relies on
the existing behavior so it is not easily changed.  Don't do it.



Raymond

From facundobatista at gmail.com  Fri May 20 23:23:25 2005
From: facundobatista at gmail.com (Facundo Batista)
Date: Fri, 20 May 2005 18:23:25 -0300
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <20050520103432.um95cugr4zhcg44k@login.werra.lunarpages.com>
References: <20050520103432.um95cugr4zhcg44k@login.werra.lunarpages.com>
Message-ID: <e04bdf31050520142329527b34@mail.gmail.com>

On 5/20/05, Michael Chermside <mcherm at mcherm.com> wrote:

> In other words, Java's behavior is much closer to the current behavior
> of Python, at least in terms of features that are user-visible. The
> default behavior in Java is to have infinite precision unless a context
> is supplied that says otherwise. So the constructor that takes a string
> converts it faithfully, while the constructor that takes a context
> obeys the context.

Are we hitting that point where the most important players (Python and
Java, ;) implement the standard in an almost fully compliant way, and
then the standard revises *that* behaviour?

For the record, I'm -0 for changing the current behaviour: I'd really
like to implement the Spec exactly, but I think the practical reasons
we have for not doing so are more important.

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

From foom at fuhm.net  Fri May 20 23:33:15 2005
From: foom at fuhm.net (James Y Knight)
Date: Fri, 20 May 2005 17:33:15 -0400
Subject: [Python-Dev] PEP 344: Explicit vs. Implicit Chaining
In-Reply-To: <Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
References: <Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
Message-ID: <5A45454A-B923-4782-907A-288C3EEDCF1B@fuhm.net>


On May 20, 2005, at 4:31 AM, Ka-Ping Yee wrote:

> Guido van Rossum wrote:
>
>> Do we really need both __context__ and __cause__?
>>
>
> Well, it depends whose needs we're trying to meet.
>
> If we want to satisfy those who have been asking for chaining
> of unexpected secondary exceptions, then we have to provide that
> on some attribute.

I still don't see why people think the python interpreter should be  
automatically providing __context__. To me it seems like it'll just  
clutter things up for no good reason. If you really want the other  
exception, you can access it via the local variable in the frame  
where it was first caught. Of course right now you don't get a  
traceback, but the proposal fixes that.

 >>> def test():
...  try:
...   1/0
...  except Exception, e:
...   y
...
 >>> test()
Traceback (most recent call last):
   File "<stdin>", line 1, in ?
   File "<stdin>", line 5, in test
NameError: global name 'y' is not defined
 >>> pdb.pm()
 > <stdin>(5)test()
(Pdb) locals()
{'e': <exceptions.ZeroDivisionError instance at 0x73198>}

James


From tim.peters at gmail.com  Fri May 20 23:40:48 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Fri, 20 May 2005 17:40:48 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <000101c55d7d$d2967ca0$ec07a044@oemcomputer>
References: <1f7befae050520091978d2a891@mail.gmail.com>
	<000101c55d7d$d2967ca0$ec07a044@oemcomputer>
Message-ID: <1f7befae05052014406c9e2948@mail.gmail.com>

[Raymond Hettinger]
> The word deviate inaccurately suggests that we do not have
> a compliant method which, of course, we do.  There are two
> methods, one context aware and the other context free.  The
> proposal is to change the behavior of the context free version,
> treat it as a bug, and alter it in the middle of a major release.

I didn't suggest changing this for 2.4.2.  Although, now that you
mention it ... <wink>.

>  The sole argument resembles bible thumping.

I'm sorry, but if you mentally reduced everything I've written about
this to "the sole argument", rational discussion has become impossible
here.

In the meantime, I've asked Mike Cowlishaw what his intent was, and
what the standard may eventually say.  I didn't express a preference
to him.  He said he'll think about it and try to get back to me by
Sunday.

From arigo at tunes.org  Fri May 20 23:44:29 2005
From: arigo at tunes.org (Armin Rigo)
Date: Fri, 20 May 2005 23:44:29 +0200
Subject: [Python-Dev] First PyPy (preview) release
Message-ID: <20050520214429.GB26290@code1.codespeak.net>


The PyPy 0.6 release
-------------------- 

*The PyPy Development Team is happy to announce the first 
public release of PyPy after two years of spare-time and
half a year of EU funded development.  The 0.6 release 
is eminently a preview release.*  

What it is and where to start 
-----------------------------

Getting started:    http://codespeak.net/pypy/index.cgi?doc/getting_started.html

PyPy Documentation: http://codespeak.net/pypy/index.cgi?doc

PyPy Homepage:      http://codespeak.net/pypy/

PyPy is an MIT-licensed reimplementation of Python written in
Python itself.  The long term goals are an implementation that
is flexible and easy to experiment with and retarget to
different platforms (also non-C ones) and such that high
performance can be achieved through high-level implementations
of dynamic optimisation techniques.

The interpreter and object model implementations shipped with 0.6 can
be run on top of CPython and implement the core language features of
Python as of CPython 2.3.  PyPy passes around 90% of the Python language
regression tests that do not depend deeply on C-extensions.  Some of
that functionality is still made available by PyPy piggy-backing on
the host CPython interpreter.  Double interpretation and abstractions
in the code-base make it so that PyPy running on CPython is quite slow
(around 2000x slower than CPython); this is expected.

This release is intended for people who want to get a feel for what we
are doing by playing with the interpreter and perusing the codebase,
and possibly to join in the fun and efforts.

Interesting bits and highlights
---------------------------------

The release is also a snapshot of our ongoing efforts towards
low-level translation and experimenting with unique features.

* By default, PyPy is a Python version that works entirely with
  new-style-class semantics.  However, support for old-style classes
  is still available: implementations of their metaclass and instance
  objects, written mostly as user-level code, are included and can be
  made the default again with the ``--oldstyle`` option.

* In PyPy, bytecode interpretation and object manipulation are
  cleanly separated between a bytecode interpreter and an
  *object space*, which implements the operations on objects.
  PyPy comes with experimental object spaces augmenting the
  standard one through delegation:

  * an experimental object space that does extensive tracing of
    bytecode and object operations;

  * the 'thunk' object space that implements lazy values and a 'become'
    operation that can exchange object identities.
  
  These spaces already give a glimpse into the potential flexibility
  of PyPy.  See demo/fibonacci.py and demo/sharedref.py for examples
  of the 'thunk' object space.

* The 0.6 release also contains a snapshot of our translation efforts
  to lower-level languages.  For that we have developed an annotator
  capable of inferring type information across our code base.  The
  annotator is already capable of successfully type-annotating
  essentially *all* of the PyPy code base, and is included with 0.6.

* From type-annotated code, low-level code needs to be generated.
  Backends for various targets (C, LLVM, ...) are included; all are
  incomplete to some degree and still very much in flux.  What is
  shipped with 0.6 can handle small to medium-sized examples.

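The 'thunk' space's lazy values can be imitated in ordinary Python.  The
sketch below illustrates only the concept, not PyPy's actual
interpreter-level API (where the laziness is transparent and applies to
ordinary objects):

```python
# Plain-Python illustration of the lazy-value idea behind the 'thunk'
# object space.  This is NOT the PyPy API: in PyPy the laziness is
# applied at the object-space level, invisibly to user code.

class Thunk:
    """A value that is computed on first use and cached thereafter."""
    def __init__(self, func):
        self._func = func
        self._computed = False
        self._value = None

    def force(self):
        # Compute once, then reuse the cached result.
        if not self._computed:
            self._value = self._func()
            self._computed = True
        return self._value

def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

lazy = Thunk(lambda: fib(20))   # nothing is computed yet
print(lazy.force())             # computed now: 6765
print(lazy.force())             # cached: no recomputation
```

In the real thunk space, forcing happens implicitly the first time the
lazy object is used, as the demos show.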

Ongoing work and near term goals
---------------------------------

Generating low-level code is the main area we are hammering on in the
next months; our plan is to produce a PyPy version in August/September 
that does not need to be interpreted by CPython anymore and will 
thus run considerably faster than the 0.6 preview release. 

PyPy has been a community effort from the start, and it would not
have gotten this far without the coding and feedback of numerous
people.  Please feel free to give feedback and raise questions.

    contact points: http://codespeak.net/pypy/index.cgi?contact

    contributor list: http://codespeak.net/pypy/index.cgi?doc/contributor.html 

have fun, 

    Armin Rigo, Samuele Pedroni, 

    Holger Krekel, Christian Tismer, 

    Carl Friedrich Bolz 


    PyPy development and activities happen as an open source project  
    and with the support of a consortium funded by a two-year EU IST 
    research grant.  The partners of the EU project are:
        
        Heinrich-Heine University (Germany), AB Strakt (Sweden)

        merlinux GmbH (Germany), tismerysoft GmbH(Germany) 

        Logilab Paris (France), DFKI GmbH (Germany)

        ChangeMaker (Sweden)


From gvanrossum at gmail.com  Fri May 20 23:45:40 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 20 May 2005 14:45:40 -0700
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <e04bdf31050520142329527b34@mail.gmail.com>
References: <20050520103432.um95cugr4zhcg44k@login.werra.lunarpages.com>
	<e04bdf31050520142329527b34@mail.gmail.com>
Message-ID: <ca471dc20505201445402a1e35@mail.gmail.com>

It looks like if you pass in a context, the Decimal constructor still
ignores that context:

>>> import decimal as d
>>> d.getcontext().prec = 4
>>> d.Decimal("1.2345678901234567890123456789012345678901234567890000",
d.getcontext())
Decimal("1.2345678901234567890123456789012345678901234567890000")
>>> 

I think this is contrary to what some here have claimed (that you
could pass an explicit context to cause it to round according to the
context's precision).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tim.peters at gmail.com  Fri May 20 23:51:47 2005
From: tim.peters at gmail.com (Tim Peters)
Date: Fri, 20 May 2005 17:51:47 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <ca471dc20505201445402a1e35@mail.gmail.com>
References: <20050520103432.um95cugr4zhcg44k@login.werra.lunarpages.com>
	<e04bdf31050520142329527b34@mail.gmail.com>
	<ca471dc20505201445402a1e35@mail.gmail.com>
Message-ID: <1f7befae0505201451142e6269@mail.gmail.com>

[Guido]
> It looks like if you pass in a context, the Decimal constructor
> still ignores that context:
> 
> >>> import decimal as d
> >>> d.getcontext().prec = 4
> >>> d.Decimal("1.2345678901234567890123456789012345678901234567890000",
> d.getcontext())
> Decimal("1.2345678901234567890123456789012345678901234567890000")
> >>>
> 
> I think this is contrary to what some here have claimed (that you
> could pass an explicit context to cause it to round according to the
> context's precision).

I think Michael Chermside said that's how a particular Java
implementation works.

Python's Decimal constructor accepts a context argument, but the only
use made of it is to possibly signal a ConversionSyntax condition.

From pje at telecommunity.com  Sat May 21 00:37:16 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Fri, 20 May 2005 18:37:16 -0400
Subject: [Python-Dev] PEP 344: Explicit vs. Implicit Chaining
In-Reply-To: <5A45454A-B923-4782-907A-288C3EEDCF1B@fuhm.net>
References: <Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
	<Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
Message-ID: <5.1.1.6.0.20050520183332.0349eec0@mail.telecommunity.com>

At 05:33 PM 5/20/2005 -0400, James Y Knight wrote:
>I still don't see why people think the python interpreter should be
>automatically providing __context__.

Because it's a pain when you have an error in your error handler, and are 
thus unable to debug the original.  (pdb isn't always an option, either; 
see below.)


>To me it seems like it'll just
>clutter things up for no good reason.

Only in cases like this:

     def __getattr__(self,attr):
         try:
             return self.__extras[attr]
         except KeyError:
             raise AttributeError

And in that particular case it's only adding a 1-line traceback.  I haven't 
thought of any more complex examples yet.


>  If you really want the other
>exception, you can access it via the local variable in the frame
>where it was first caught. Of course right now you don't get a
>traceback, but the proposal fixes that.

This only helps if you can get to a debugger.  What if you're reading your 
web server's error log?


From mcherm at mcherm.com  Sat May 21 02:07:26 2005
From: mcherm at mcherm.com (Michael Chermside)
Date: Fri, 20 May 2005 17:07:26 -0700
Subject: [Python-Dev] Adventures with Decimal
Message-ID: <20050520170726.luoabo8keh5wwws4@login.werra.lunarpages.com>

Guido writes:
> It looks like if you pass in a context, the Decimal constructor still
> ignores that context

No, you just need to use the right syntax. The correct syntax for
converting a string to a Decimal using a context object is to use
the create_decimal() method of the context object:

>>> import decimal
>>> decimal.getcontext().prec = 4
>>> decimal.getcontext().create_decimal("1.234567890")
Decimal("1.235")

Frankly, I have no idea WHAT purpose is served by passing a context
to the decimal constructor... I didn't even realize it was allowed!

-- Michael Chermside


From gvanrossum at gmail.com  Sat May 21 01:13:16 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Fri, 20 May 2005 16:13:16 -0700
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <1f7befae0505201451142e6269@mail.gmail.com>
References: <20050520103432.um95cugr4zhcg44k@login.werra.lunarpages.com>
	<e04bdf31050520142329527b34@mail.gmail.com>
	<ca471dc20505201445402a1e35@mail.gmail.com>
	<1f7befae0505201451142e6269@mail.gmail.com>
Message-ID: <ca471dc205052016132bd84018@mail.gmail.com>

> [Guido]
> > It looks like if you pass in a context, the Decimal constructor
> > still ignores that context:
> >
> > >>> import decimal as d
> > >>> d.getcontext().prec = 4
> > >>> d.Decimal("1.2345678901234567890123456789012345678901234567890000",
> > d.getcontext())
> > Decimal("1.2345678901234567890123456789012345678901234567890000")
> > >>>
> >
> > I think this is contrary to what some here have claimed (that you
> > could pass an explicit context to cause it to round according to the
> > context's precision).

[Tim]
> I think Michael Chermside said that's how a particular Java
> implementation works.
> 
> Python's Decimal constructor accepts a context argument, but the only
> use made of it is to possibly signal a ConversionSyntax condition.

You know that, but Raymond seems confused.  From one of his posts (point (k)):

"Throughout the implementation, the code calls the Decimal
constructor to create intermediate values.  Every one of those calls
would need to be changed to specify a context."

But passing a context doesn't help for obtaining the desired precision.

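The two behaviours can be seen side by side (a quick sketch with the
decimal module, restating the observation above):

```python
from decimal import Decimal, getcontext

ctx = getcontext()
ctx.prec = 4

s = "1.2345678901234567890123456789012345678901234567890000"

# Passing a context to the constructor does not cause rounding:
assert str(Decimal(s, ctx)) == s

# The context's create_decimal() method does round, to prec digits:
assert ctx.create_decimal(s) == Decimal("1.235")
```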
PS I also asked Cowlishaw and he said he would ponder it over the
weekend. Maybe Raymond can mail him too. ;-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From facundobatista at gmail.com  Sat May 21 02:53:33 2005
From: facundobatista at gmail.com (Facundo Batista)
Date: Fri, 20 May 2005 20:53:33 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <ca471dc205052016132bd84018@mail.gmail.com>
References: <20050520103432.um95cugr4zhcg44k@login.werra.lunarpages.com>
	<e04bdf31050520142329527b34@mail.gmail.com>
	<ca471dc20505201445402a1e35@mail.gmail.com>
	<1f7befae0505201451142e6269@mail.gmail.com>
	<ca471dc205052016132bd84018@mail.gmail.com>
Message-ID: <e04bdf31050520175333b4a3a@mail.gmail.com>

On 5/20/05, Guido van Rossum <gvanrossum at gmail.com> wrote:

> > Python's Decimal constructor accepts a context argument, but the only
> > use made of it is to possibly signal a ConversionSyntax condition.
> 
> You know that, but Raymond seems confused.  From one of his posts (point (k)):
> 
> "Throughout the implementation, the code calls the Decimal
> constructor to create intermediate values.  Every one of those calls
> would need to be changed to specify a context."

The point here, I think, is that intermediate Decimal objects are
created, and the whole module assumes that the context does not affect
those intermediate values.  If you change this and start applying the
context at Decimal creation time, you'll have to account for that in a
lot of parts of the code.

OTOH, you can change that and run the test cases, and see how badly it
explodes (or not ;).

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

From python at rcn.com  Sat May 21 03:51:55 2005
From: python at rcn.com (Raymond Hettinger)
Date: Fri, 20 May 2005 21:51:55 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <20050520170726.luoabo8keh5wwws4@login.werra.lunarpages.com>
Message-ID: <000801c55da7$ac136140$871dc797@oemcomputer>

[Michael Chermside]
> Frankly, I have no idea WHAT purpose is served by passing a context
> to the decimal constructor... I didn't even realize it was allowed!

Quoth the docs for the Decimal constructor:

"""
The context precision does not affect how many digits are stored. That
is determined exclusively by the number of digits in value. For example,
"Decimal("3.00000")" records all five zeroes even if the context
precision is only three. 

The purpose of the context argument is determining what to do if value
is a malformed string. If the context traps InvalidOperation, an
exception is raised; otherwise, the constructor returns a new Decimal
with the value of NaN.

"""



Raymond

From ncoghlan at gmail.com  Sat May 21 03:56:40 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 21 May 2005 11:56:40 +1000
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <20050520170726.luoabo8keh5wwws4@login.werra.lunarpages.com>
References: <20050520170726.luoabo8keh5wwws4@login.werra.lunarpages.com>
Message-ID: <428E9558.5090705@gmail.com>

Michael Chermside wrote:
> Frankly, I have no idea WHAT purpose is served by passing a context
> to the decimal constructor... I didn't even realize it was allowed!

As Tim pointed out, it's solely to control whether ConversionSyntax 
errors raise an exception or not:

Py> decimal.Decimal("a")
Traceback (most recent call last):
   File "<stdin>", line 1, in ?
   File "c:\python24\lib\decimal.py", line 571, in __new__
     self._sign, self._int, self._exp = context._raise_error(ConversionSyntax)
   File "c:\python24\lib\decimal.py", line 2266, in _raise_error
     raise error, explanation
decimal.InvalidOperation
Py> context = decimal.getcontext().copy()
Py> context.traps[decimal.InvalidOperation] = False
Py> decimal.Decimal("a", context)
Decimal("NaN")

I'm tempted to suggest deprecating the feature, and saying that if you want 
invalid strings to produce NaN, you should use the create_decimal() method of 
Context objects.  That would mean the standard construction operation becomes 
genuinely context-free.  Being able to supply a context, only to have it be 
mostly ignored, is rather confusing.

Doing this may also fractionally speed up Decimal creation from strings in the 
normal case, as the call to getcontext() could probably be omitted from the 
constructor.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From raymond.hettinger at verizon.net  Sat May 21 03:56:28 2005
From: raymond.hettinger at verizon.net (Raymond Hettinger)
Date: Fri, 20 May 2005 21:56:28 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <e04bdf31050520175333b4a3a@mail.gmail.com>
Message-ID: <000901c55da8$4e2a2180$871dc797@oemcomputer>

[Guido]
> > You know that, but Raymond seems confused.  From one of his posts
> > (point (k)):

[Raymond]
> > "Throughout the implementation, the code calls the Decimal
> > constructor to create intermediate values.  Every one of those calls
> > would need to be changed to specify a context."

[Facundo]
> The point here, I think, is that intermediate Decimal objects are
> created, and the whole module assumes that the context does not affect
> those intermediate values.  If you change this and start applying the
> context at Decimal creation time, you'll have to account for that in a
> lot of parts of the code.
> 
> OTOH, you can change that and run the test cases, and see how bad it
> explodes (or not, ;).

Bingo!

That is point (k) from the big missive.


Raymond


From raymond.hettinger at verizon.net  Sat May 21 03:56:28 2005
From: raymond.hettinger at verizon.net (Raymond Hettinger)
Date: Fri, 20 May 2005 21:56:28 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <ca471dc20505201445402a1e35@mail.gmail.com>
Message-ID: <000a01c55da8$4eb809a0$871dc797@oemcomputer>

> It looks like if you pass in a context, the Decimal constructor still
> ignores that context:
> 
> >>> import decimal as d
> >>> d.getcontext().prec = 4
> >>> d.Decimal("1.2345678901234567890123456789012345678901234567890000",
> d.getcontext())
> Decimal("1.2345678901234567890123456789012345678901234567890000")
> >>>
> 
> I think this is contrary to what some here have claimed (that you
> could pass an explicit context to cause it to round according to the
> context's precision).

That's not the way it is done.  The context passed to the Decimal
constructor is *only* used to determine what to do with a malformed
string (whether to raise an exception or set a flag).

To create a decimal with a context, use the Context.create_decimal()
method:

>>> import decimal as d
>>> d.getcontext().prec = 4
>>> d.getcontext().create_decimal("1.2345678901234567890123456789012345678901234567890000")
Decimal("1.235")



Raymond


From raymond.hettinger at verizon.net  Sat May 21 03:56:28 2005
From: raymond.hettinger at verizon.net (Raymond Hettinger)
Date: Fri, 20 May 2005 21:56:28 -0400
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <1f7befae05052014406c9e2948@mail.gmail.com>
Message-ID: <000b01c55da8$4f25e6a0$871dc797@oemcomputer>

[Tim]
> I'm sorry, but if you mentally reduced everything I've written about
> this to "the sole argument", rational discussion has become impossible
> here.

Forgive me one melodramatic email.

I've laid out my reasoning and understand yours.

Crossing light sabers with one such as yourself is of course a foolhardy
undertaking.

A root difference is that I believe we have both a compliant
implementation (using Context.create_decimal) and a practical
context-free extension in the form of the regular Decimal constructor.

A second difference is that you see harm in allowing any context-free
construction, while I see greater harm in re-introducing representation
error when that is what we were trying to fix in the first place.

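The representation error in question is the familiar binary-float one;
a minimal sketch of the contrast, using the decimal module as shipped:

```python
from decimal import Decimal

# The string constructor preserves the literal exactly; there is no
# binary representation error of the kind that affects floats.
assert str(Decimal("1.1")) == "1.1"

# With binary floats, the error surfaces immediately:
assert 1.1 * 3 != 3.3                        # 3.3000000000000003
assert Decimal("1.1") * 3 == Decimal("3.3")  # exact in decimal
```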
The rest is just practicalities and engineering (altering decimal's
internals may be a non-trivial undertaking).

May the force be with you,


Raymond


From aahz at pythoncraft.com  Sat May 21 07:24:58 2005
From: aahz at pythoncraft.com (Aahz)
Date: Fri, 20 May 2005 22:24:58 -0700
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <000401c55d0f$06e9b7c0$2db0958d@oemcomputer>
References: <000a01c55d04$20eecb20$2db0958d@oemcomputer>
	<000401c55d0f$06e9b7c0$2db0958d@oemcomputer>
Message-ID: <20050521052458.GA8579@panix.com>

On Fri, May 20, 2005, Raymond Hettinger wrote:
>
> k.) The biggest client of all these methods is the Decimal module
> itself.  Throughout the implementation, the code calls the Decimal
> constructor to create intermediate values.  Every one of those calls
> would need to be changed to specify a context.  Some of those cases are
> not trivially changed (for instance, the hash method doesn't have a
> context but it needs to check to see if a decimal value is exactly an
> integer so it can hash to that value).  Likewise, how do you use a
> decimal value for a dictionary key when the equality check is context
> dependent (change precision and lose the ability to reference an entry)?

I'm not sure this is true, and if it is true, I think the Decimal module
is poorly implemented.  There are two uses for the Decimal() constructor:

* copy constructor for an existing Decimal instance (or passing in a
tuple directly to mimic the barebones internal)

* conversion constructor for other types, such as string

Are you claiming that the intermediate values are being constructed as
strings and then converted back to Decimal objects?  Is there something
else I'm missing?  I don't think Tim is claiming that the copy
constructor needs to obey context, just string conversions.

Note that comparison is not context-dependent, because context only
applies to results of operations, and the spec's comparison operator
(equivalent to cmp()) only returns (-1,0,1) -- guaranteed to be within
the precision of any context.  ;-)

Note that hashing is not part of the standard, so whatever makes most
sense in a Pythonic context would be appropriate.  It's perfectly
reasonable for Decimal's __int__ method to be unbounded because Python
ints are unbounded.

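Both points are easy to verify with the module as shipped (a quick
sketch):

```python
from decimal import Decimal, getcontext

getcontext().prec = 4   # deliberately tiny, to expose any rounding

# Equality is exact and ignores the context precision:
assert Decimal("1.234567890") == Decimal("1.234567890")
assert Decimal("1.234567890") != Decimal("1.235")

# int() conversion is likewise exact and unbounded:
assert int(Decimal("123456789012345678901234567890")) == \
       123456789012345678901234567890
```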
All these caveats aside, I don't have a strong opinion about what we
should do.  Overall, my sentiments are with Tim that we should fix this,
but my suspicion is that it probably doesn't matter much.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"The only problem with Microsoft is they just have no taste." --Steve Jobs

From greg.ewing at canterbury.ac.nz  Sat May 21 11:20:38 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 21 May 2005 21:20:38 +1200
Subject: [Python-Dev] Adventures with Decimal
References: <000101c55d7d$d2967ca0$ec07a044@oemcomputer>
Message-ID: <428EFD66.4090900@canterbury.ac.nz>

Raymond Hettinger wrote:

> From the mists of Argentina, a Paladin set things right.  The literal
> 1.1 became representable and throughout the land the monster was
> believed to have been slain.

I don't understand. Isn't the monster going to pop
right back up again as soon as anyone does any
arithmetic with the number?

I don't see how you can regard what Decimal does
as "schoolbook arithmetic" unless the teacher is
reaching over your shoulder and blacking out any
excess digits after everything you do.

And if that's acceptable, I don't see how it
helps significantly to have just the very first
step -- turning the input into numbers -- be
exempt from this behaviour. If anything, people
are going to be even more confused. "But it
can obviously cope with 1.1000000000000000001,
so why does it give the wrong answer when I add
something to it?"

Greg


From p.f.moore at gmail.com  Sat May 21 14:12:50 2005
From: p.f.moore at gmail.com (Paul Moore)
Date: Sat, 21 May 2005 13:12:50 +0100
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <000b01c55da8$4f25e6a0$871dc797@oemcomputer>
References: <1f7befae05052014406c9e2948@mail.gmail.com>
	<000b01c55da8$4f25e6a0$871dc797@oemcomputer>
Message-ID: <79990c6b05052105126197e473@mail.gmail.com>

On 5/21/05, Raymond Hettinger <raymond.hettinger at verizon.net> wrote:
> A root difference is that I believe we have both a compliant
> implementation (using Context.create_decimal) and a practical context
> free extension in the form of the regular Decimal constructor.

Please forgive an intrusion by someone who has very little knowledge
of floating point pitfalls.

My mental model of Decimal is "pocket calculator arithmetic" (I
believe this was originally prompted by Tim, as I had previously been
unaware that calculators used decimal hardware). In that model, fixed
precision is the norm - it's the physical number of digits the box
displays. And setting the context is an extremely rare operation - it
models swapping to a different device (something I do do in real life,
when I have an 8-digit box and am working with numbers bigger than
that - but with Decimal, the model is a 28-digit box by default, and
that's big enough for me!)

Construction models typing a number in, and this is where the model
breaks down. On a calculator, you physically cannot enter a number
with more digits than the precision, so converting a string with
excess precision doesn't come into it. And yet, Decimal('...') is the
"obvious" constructor, and should do what people "expect".

In many ways, I could happily argue for an exception if the string has
too many digits. I could also argue for truncation (as that's what
many calculators actually do - ignore any excess typing). No
calculator rounds excess input, but I can accept it as what they might
well do if it were physically possible. And of course, in a practical
sense, I'll be working with 28-digit precision, so I'll never hit the
situation in any case, and I don't care :-)

> A second difference is that you see harm in allowing any context free
> construction while I see greater harm from re-introducing representation
> error when that is what we were trying to fix in the first place.

The types of rounding errors (to use the naive term deliberately) that
decimal suffers from are far more familiar to people, because they use
calculators. With a calculator, I'm *used* to (1/3) * 3 not coming out
as exactly 1. And indeed we have

>>> (Decimal(1)/Decimal(3))*Decimal(3)
Decimal("0.9999999999999999999999999999")

Now try that with strings:

>>> (Decimal("1")/Decimal("3"))*Decimal("3")
Decimal("0.9999999999999999999999999999")
>>> (Decimal("1.0")/Decimal("3.0"))*Decimal("3.0")
Decimal("0.9999999999999999999999999999")

Nope, I don't see anything surprising.

After a bit more experimentation, I'm unable to make *anything*
surprise me, using either Decimal() or getcontext().create_decimal().
Of course, I've never bothered typing enough digits that I care about
(trailing zeroes don't count!) to trigger the rounding behaviour of
the constructor that matters here, but I don't ever expect to in real
life.

Apologies for the rambling discussion - it helped me as a non-expert
to understand what the issue is here. Having done so, I find that I am
unable to care. (Which is good, because I'm not the target audience
for the distinction :-))

So, to summarise, I can't see that a change would affect me at all. I
mildly favour Tim's position - because Raymond's seems to be based on
practicality for end users (where Tim's is based on convenience for
experts), and I can't see any practical effect on me from Tim's change.

OTOH, if end user impact were the driving force, I'd rather see
Decimal(string) raise an Inexact exception if the string would be
rounded:

>>> # Remember, my argument is that I'd never do the following in practice,
>>> # so this is solely for a highly unusual edge case!
>>> decimal.getcontext().prec=5

>>> # This confuses me - it silently gives "the wrong" answer in my
>>> # mental model.
>>> Decimal("1.23456789") * 2
Decimal("2.4691")

>>> c = decimal.getcontext().copy()
>>> c.traps[decimal.Inexact] = True

>>> # This does what I expect - it tells me that I've done something wrong!
>>> c.create_decimal("1.23456789") * 2
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "C:\Apps\Python24\lib\decimal.py", line 2291, in create_decimal
    return d._fix(self)
  File "C:\Apps\Python24\lib\decimal.py", line 1445, in _fix
    ans = ans._round(prec, context=context)
  File "C:\Apps\Python24\lib\decimal.py", line 1567, in _round
    context._raise_error(Inexact, 'Changed in rounding')
  File "C:\Apps\Python24\lib\decimal.py", line 2215, in _raise_error
    raise error, explanation
decimal.Inexact: Changed in rounding

I hope this helps,
Paul.

From foom at fuhm.net  Sat May 21 16:33:27 2005
From: foom at fuhm.net (James Y Knight)
Date: Sat, 21 May 2005 10:33:27 -0400
Subject: [Python-Dev] PEP 344: Explicit vs. Implicit Chaining
In-Reply-To: <5.1.1.6.0.20050520183332.0349eec0@mail.telecommunity.com>
References: <Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
	<Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
	<5.1.1.6.0.20050520183332.0349eec0@mail.telecommunity.com>
Message-ID: <F93DFDC3-9113-4461-A8CA-FFF0EC95E631@fuhm.net>

On May 20, 2005, at 6:37 PM, Phillip J. Eby wrote:
> This only helps if you can get to a debugger.  What if you're  
> reading your web server's error log?

Then you're in trouble anyway, because you also need the contents of
some local to figure out what's going on.

James

From python-dev at zesty.ca  Sat May 21 17:23:27 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Sat, 21 May 2005 10:23:27 -0500 (CDT)
Subject: [Python-Dev] PEP 344: Explicit vs. Implicit Chaining
In-Reply-To: <F93DFDC3-9113-4461-A8CA-FFF0EC95E631@fuhm.net>
References: <Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
	<Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
	<5.1.1.6.0.20050520183332.0349eec0@mail.telecommunity.com>
	<F93DFDC3-9113-4461-A8CA-FFF0EC95E631@fuhm.net>
Message-ID: <Pine.LNX.4.58.0505211023010.4932@server1.LFW.org>

On Sat, 21 May 2005, James Y Knight wrote:
> On May 20, 2005, at 6:37 PM, Phillip J. Eby wrote:
> > This only helps if you can get to a debugger.  What if you're
> > reading your web server's error log?
>
> Then you're in trouble anyways because you need the contents of some
> local to figure out what's going on, also.

Ever used cgitb?


-- ?!ng

From MFC at uk.ibm.com  Sun May 22 11:30:05 2005
From: MFC at uk.ibm.com (Mike Cowlishaw)
Date: Sun, 22 May 2005 10:30:05 +0100
Subject: [Python-Dev] Adventures with Decimal
Message-ID: <OF6FC3D106.14C3615B-ON80257009.0033F648-80257009.00343131@uk.ibm.com>

Several people have pointed me at this interesting thread, and
both Tim and Raymond have sent me summaries of their arguments.
Thank you all!  I see various things I have written have caused
some confusion, for which I apologise.

The 'right' answer might, in fact, depend somewhat on the
programming language, as I'll try and explain below, but let me
first try and summarize the background of the decimal specification
which is on my website at:

  http://www2.hursley.ibm.com/decimal/#arithmetic


Rexx
----
Back in 1979/80, I was writing the Rexx programming language,
which has always had (only) decimal arithmetic.  In 1980, it was
used within IBM in over 40 countries, and had evolved a decimal
arithmetic which worked quite well, but had some rather quirky
arithmetic and rounding rules -- in particular, the result of an
operation had a number of decimal places equal to the larger of
the number of decimal places of its operands.

Hence 1.23 + 1.27 gave 2.50 and 1.230000 + 1.27 gave 2.500000.
This had some consequences that were quite predictable, but
were unexpected by most people.  For example, 1.2 x 1.2 gave 1.4,
and you had to suffix a 0 to one of the operands (easy to do in
Rexx) to get an exact result: 1.2 x 1.20 => 1.44.

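For contrast, Python's decimal module follows the later specification:
results are computed as if to infinite precision and then rounded to the
context precision, so the 1.2 x 1.2 quirk above does not arise (a quick
sketch, not Rexx):

```python
from decimal import Decimal, getcontext

getcontext().prec = 28   # the default precision

# Old Rexx rules: 1.2 x 1.2 gave 1.4 (result kept one decimal place).
# Current spec: the exact product is kept, rounded only if it exceeds
# the context precision.
assert Decimal("1.2") * Decimal("1.2") == Decimal("1.44")
assert Decimal("1.23") + Decimal("1.27") == Decimal("2.50")
```
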
By 1981, much of the e-mail and feedback I was getting was related
to various arithmetic quirks like this.  My design strategy for
the language was more-or-less to 'minimise e-mail' (I was getting
350+ every day, as there were no newsgroups or forums then) --
and it was clear that the way to minimise e-mail was to make the
language work the way people expected (not just in arithmetic).

I therefore 'did the research' on arithmetic to find out what it
was that people expected (and it varies in some cases, around the
world), and then changed the arithmetic to match that.  The result
was that e-mail on the subject dropped to almost nothing, and
arithmetic in Rexx became a non-issue: it just did what people
expected.

Its strongest feature is, I think, that "what you see is what
you've got" -- there are no hidden digits, for example.  Indeed,
in at least one Rexx interpreter the numbers are, literally,
character strings, and arithmetic is done directly on those
character strings (with no conversions or alternative internal
representation).

I therefore feel, quite strongly, that the value of a literal is,
and must be, exactly what appears on the paper.  And, in a
language with no constructors (such as Rexx), and unlimited
precision, this is straightforward.  The assignment

  a = 1.10000001

is just that; there's no operation involved, and I would argue
that anyone reading that and knowing the syntax of a Rexx
assignment would expect the variable a to have the exact value of
the literal (that is, "say a" would then display 1.10000001).

The Rexx arithmetic does have the concept of 'context', which
mirrors the way people do calculations on paper -- there are some
implied rules (how many digits to work to, etc.) beyond the sum
that is written down.  This context, in Rexx, "is used to change
the way in which arithmetic operations are carried out", and does
not affect other operations (such as assignment).



Java
----
So what should one do in an object-oriented language, where
numbers are objects?  Java is perhaps a good model, here.  The
Java BigDecimal class originally had only unlimited precision
arithmetic (the results of multiplies just got longer and longer)
and only division had a mechanism to limit (round) the result in
some way, as it must.

By 1997, it became obvious that the original BigDecimal, though
elegant in its simplicity, was hard to use.  We (IBM) proposed
various improvements and built a prototype:

  http://www2.hursley.ibm.com/decimalj/

and this eventually became a formal Java Specification Request:

  http://jcp.org/aboutJava/communityprocess/review/jsr013/index.html

which led to the extensive enhancements in BigDecimal that were
shipped last year in Java 5:

  http://java.sun.com/j2se/1.5.0/docs/api/java/math/BigDecimal.html

In summary, for each operation (such as a.add(b)) a new method was
added which takes a context: a.add(b, context).  The context
supplies the rounding precision and rounding mode.  Since the
arguments to an operation can be of any length (precision), the
rounding rule is simple: the operation is carried out as though to
infinite precision and is then rounded (if necessary).  This rule
avoids double-rounding.
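Python's decimal module follows the same rule from the spec, so the
effect is easy to sketch there (assuming the default ROUND_HALF_EVEN
rounding):

```python
from decimal import Decimal, getcontext

# Operands may carry any number of digits; the multiplication is carried
# out as though to infinite precision and the result is rounded once to
# the context precision -- no double rounding.
getcontext().prec = 6
product = Decimal("1.2345678901") * Decimal("2")
# product == Decimal("2.46914")
```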

Constructors were not a point of debate.  The constructors in the
original BigDecimal always gave an exact result (even when
constructing from a binary double) so those were not going to
change.  We did, however, almost as an afterthought, add versions
of the constructors that took a context argument.

The model, therefore, is essentially the same as the Rexx one:
what you see is what you get.  In Java, the assignment:

  BigDecimal a = new BigDecimal("1.10000001");

ends up with a having an object with the value you see in the
string, and for it to be otherwise one would have to write:

  BigDecimal a = new BigDecimal("1.10000001", context);

which gives a very nice clue that something may happen to the
value.  This, to me, seems a clean design.


So why does my specification appear to say something different?
---------------------------------------------------------------
Both the languages described so far support arbitrary-length
decimal numbers.  Over the past five years, however, I have been
concentrating more on fixed-length decimals, as in languages such
as C#, and as will be the case in C, C++, and in hardware.

When the representation of a decimal number has a fixed length,
then the nice clean model of a one-to-one mapping of a literal to
the internal representation is no longer always possible.  For
example, the IEEE 754r proposed decimal32 format can represent a
maximum of 7 decimal digits in the significand.  Hence, the
assignment:

  decimal32 d = 1.10000001;

(in some hypothetical C-like language) cannot result in d having
the value shown in the literal.  This is the point where language
history or precedent comes in: some languages might quietly round
at this point, others might give a compile-time warning or error
(my preference, at least for decimal types).  Similar concerns
apply when the conversion to internal form causes overflow or
underflow.

The wording in the specification was intended to allow for these
kinds of behaviors, and to allow for explicit rounding, using a
context, when a string is converted to some internal
representation.  It was not intended to restrict the behavior of
(for example) the Java constructor: one might consider that
constructor to be working with an implied context which has an
infinite precision.  In other words, the specification does not
attempt to define where the context comes from, as this would seem
to be language-dependent.  In Java, any use of a programmer-
supplied context is explicit and visible, and if none is supplied
then the implied context has UNLIMITED precision.  In Rexx, the
context is always implicit -- but there is no 'conversion from
string to number' because numbers _are_ strings.


So what should Python do?
-------------------------
Since your Decimal class has the ability to preserve the value of
a literal exactly, my recommendation is that that should be the
default behavior.  Changing the value supplied as a literal
without some explicit syntax is likely to surprise, given the
knowledge that there are no length restrictions in the class.

My view is that only in the case of a fixed-length destination
or an explicit context might such a rounding be appropriate.

Given that Python has the concept of an implicit decimal context,
I can see why Tim can argue that the implicit context is the
context which applies for a constructor.  However, perhaps you can
define that the implicit context 'only applies to arithmetic
operations', or some such definition, much as in Rexx?
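This is, in fact, observable in the Python module as it stands: the
constructor preserves the literal exactly, while arithmetic rounds to
the context precision.  A minimal sketch:

```python
from decimal import Decimal, getcontext

getcontext().prec = 3
exact = Decimal("1.100000000000000001")   # constructor keeps every digit
rounded = exact + 0                       # arithmetic rounds to 3 digits
# exact   == Decimal("1.100000000000000001")
# rounded == Decimal("1.10")
```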

And I should clarify my specification to make it clear that
preserving the value, if possible, is preferable to rounding --
any suggestions for wording?


Mike Cowlishaw

[I'll try and follow this thread in the mailing list for a while,
but I am flying to the USA on Monday so my e-mail access will be
erratic for the next week or so.]

From raymond.hettinger at verizon.net  Mon May 23 04:13:36 2005
From: raymond.hettinger at verizon.net (Raymond Hettinger)
Date: Sun, 22 May 2005 22:13:36 -0400
Subject: [Python-Dev] Decimal FAQ
Message-ID: <016f01c55f3d$077cf280$7d27a044@oemcomputer>

Some of the private email I've received indicates a need for a decimal
FAQ that would shorten the module's learning curve.

A discussion draft follows.


Raymond


-------------------------------------------------------


Q.  It is cumbersome to type decimal.Decimal('1234.5').  Is there a
way to minimize typing when using the interactive interpreter?

A.  Some users prefer to abbreviate the constructor to just a single
letter:

>>> D = decimal.Decimal
>>> D('1.23') + D('3.45')
Decimal("4.68")


Q.  I'm writing a fixed-point application to two decimal places.
Some inputs have many places and needed to be rounded.  Others
are not supposed to have excess digits and need to be validated.
What methods should I use?

A.  The quantize() method rounds to a fixed number of decimal
places.  If the Inexact trap is set, it is also useful for
validation:

>>> TWOPLACES = Decimal(10) ** -2
>>> # Round to two places
>>> Decimal("3.214").quantize(TWOPLACES)
Decimal("3.21")
>>> # Validate that a number does not exceed two places
>>> Decimal("3.21").quantize(TWOPLACES,
context=Context(traps=[Inexact]))
Decimal("3.21")


Q.  Once I have valid two place inputs, how do I maintain that invariant
throughout an application?

A.  Some operations like addition and subtraction automatically
preserve fixed point.  Others, like multiplication and division,
change the number of decimal places and need to be followed-up with
a quantize() step.
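For example, two-place values stay fixed under addition, but a product
picks up extra places and needs the follow-up quantize() step:

```python
from decimal import Decimal

TWOPLACES = Decimal("0.01")
a = Decimal("1.10")
b = Decimal("2.30")

total = a + b                           # Decimal("3.40") -- still two places
product = (a * b).quantize(TWOPLACES)   # 1.10 * 2.30 is 2.5300; re-quantize
# product == Decimal("2.53")
```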


Q.  There are many ways to express the same value.  The numbers
200, 200.000, 2E2, and .02E+4 all have the same value at various
precisions.  Is there a way to transform these to a single
recognizable canonical value?

A.  The normalize() method maps all equivalent values to a single
representative:

>>> values = map(Decimal, '200 200.000 2E2 .02E+4'.split())
>>> [v.normalize() for v in values]
[Decimal("2E+2"), Decimal("2E+2"), Decimal("2E+2"), Decimal("2E+2")]


Q.  Is there a way to convert a regular float to a Decimal?

A.  Yes, all binary floating point numbers can be exactly expressed
as a Decimal.  An exact conversion may take more precision than
intuition would suggest, so trapping Inexact will signal a need for
more precision:

import math
from decimal import Decimal, Context, Inexact, getcontext, setcontext

def floatToDecimal(f):
    "Convert a floating point number to a Decimal with no loss of information"
    # Transform (exactly) a float to a mantissa (0.5 <= abs(m) < 1.0) and an
    # exponent.  Double the mantissa until it is an integer.  Use the integer
    # mantissa and exponent to compute an equivalent Decimal.  If this cannot
    # be done exactly, then retry with more precision.

    mantissa, exponent = math.frexp(f)
    while mantissa != int(mantissa):
        mantissa *= 2
        exponent -= 1
    mantissa = int(mantissa)
    oldcontext = getcontext()
    setcontext(Context(traps=[Inexact]))
    try:
        while True:
            try:
                return mantissa * Decimal(2) ** exponent
            except Inexact:
                getcontext().prec += 1
    finally:
        setcontext(oldcontext)


Q.  Why isn't the floatToDecimal() routine included in the module?

A.  There is some question about whether it is advisable to mix
binary and decimal floating point.  Also, its use requires some care
to avoid the representation issues associated with binary floating
point:

>>> floatToDecimal(1.1)
Decimal("1.100000000000000088817841970012523233890533447265625")


Q.  I have a complex calculation.  How can I make sure that I
haven't gotten a spurious result because of insufficient precision
or rounding anomalies?

A.  The decimal module makes it easy to test results.  A best
practice is to re-run calculations using greater precision and with
various rounding modes.  Widely differing results indicate
insufficient precision, rounding mode issues, ill-conditioned
inputs, or a numerically unstable algorithm.


Q.  I noticed that context precision is applied to the results of
operations but not to the inputs.  Is there anything I should watch
out for when mixing values of different precisions?

A.  Yes.  The principle is that all values are considered to be
exact and so is the arithmetic on those values.  Only the results
are rounded.  The advantage for inputs is that "what you type is
what you get".  A disadvantage is that the results can look odd if
you forget that the inputs haven't been rounded:

>>> getcontext().prec = 3
>>> Decimal('3.104') + Decimal('2.104')
Decimal("5.21")
>>> Decimal('3.104') + Decimal('0.000') + Decimal('2.104')
Decimal("5.20")

The solution is either to increase precision or to force rounding
of inputs using the unary plus operation:

>>> getcontext().prec = 3
>>> +Decimal('1.23456789')
Decimal("1.23")

Alternatively, inputs can be rounded upon creation using the
Context.create_decimal() method:

>>> Context(prec=5, rounding=ROUND_DOWN).create_decimal('1.2345678')
Decimal("1.2345")


Q.  I'm writing an application that tracks measurement units along
with numeric values (for example 1.1 meters and 2.3 grams).  Is a
Decimal subclass the best approach?

A.  Like other numeric types, Decimal objects are dimensionless and
all of their methods are designed around this concept.  To add
dimension, a Decimal subclass would likely need to override every
method.  For example, without overriding the __add__() method in a
Measurement subclass of Decimal, a calculation like "MeasurementA +
MeasurementB" would return a dimensionless Decimal object instead of
another Measurement object -- the units would be lost.

A simple alternative is to construct record tuples such as
(Decimal("1.1"), "meters").  This allows direct use of existing
decimal methods: if a[1] == b[1]: return (a[0]+b[0], a[1]).

A more versatile approach is to create a separate class with Decimal
objects as attributes (working in a "has-a" capacity rather than an
"is-a" capacity) and then delegating the arithmetic to the Decimal
class.
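A hypothetical sketch of the "has-a" approach (the class and its
methods are invented for illustration, not part of the module):

```python
from decimal import Decimal

class Measurement:
    """Wrap a Decimal value and a unit; delegate arithmetic to Decimal."""
    def __init__(self, value, unit):
        self.value = Decimal(value)
        self.unit = unit
    def __add__(self, other):
        # Refuse to mix units; otherwise let Decimal do the arithmetic.
        if self.unit != other.unit:
            raise ValueError("unit mismatch: %s vs %s" % (self.unit, other.unit))
        return Measurement(self.value + other.value, self.unit)
```

Measurement("1.1", "meters") + Measurement("2.3", "meters") then
yields a Measurement of 3.4 meters, with the unit preserved.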



From t-meyer at ihug.co.nz  Mon May 23 04:30:28 2005
From: t-meyer at ihug.co.nz (Tony Meyer)
Date: Mon, 23 May 2005 14:30:28 +1200
Subject: [Python-Dev] Decimal FAQ
In-Reply-To: <ECBA357DDED63B4995F5C1F5CBE5B1E802E5A24C@its-xchg4.massey.ac.nz>
Message-ID: <ECBA357DDED63B4995F5C1F5CBE5B1E801DB01DE@its-xchg4.massey.ac.nz>

> Q.  I'm writing a fixed-point application to two decimal places.
> Some inputs have many places and needed to be rounded.  Others
> are not supposed to have excess digits and need to be validated.
> What methods should I use?
> 
> A.  The quantize() method rounds to a fixed number of decimal
> places.  If the Inexact trap is set, it is also useful for
> validation:
> 
> >>> TWOPLACES = Decimal(10) ** -2
> >>> # Round to two places
> >>> Decimal("3.214").quantize(TWOPLACES)
> Decimal("3.21")
> >>> # Validate that a number does not exceed two places
> >>> Decimal("3.21").quantize(TWOPLACES,
> context=Context(traps=[Inexact]))
> Decimal("3.21")

I think an example of what happens when it does exceed two places would make
this example clearer.  For example, adding this to the end of that:

>>> Decimal("3.214").quantize(TWOPLACES, context=Context(traps=[Inexact]))
Traceback (most recent call last):
[...]
Inexact: Changed in rounding

=Tony.Meyer


From greg.ewing at canterbury.ac.nz  Mon May 23 04:32:53 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 23 May 2005 14:32:53 +1200
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <000101c55e2d$2431a620$7d27a044@oemcomputer>
References: <000101c55e2d$2431a620$7d27a044@oemcomputer>
Message-ID: <429140D5.7010908@canterbury.ac.nz>

Raymond Hettinger wrote:

> IMO, user input (or
> the full numeric strings in a text data file) is sacred and presumably
> done for a reason -- the explicitly requested digits should not be
> thrown away without good reason.

I still don't understand what's so special about the
input phase that it should be treated sacredly, while
happily desecrating the result of any *other* operation.

To my mind, if you were really serious about treating
precision as sacred, the result of every operation
would be the greater of the precisions of the
inputs. That's what happens in C or Fortran - you
add two floats and you get a float; you add a float
and a double and you get a double; etc.

> Truncating/rounding a
> literal at creation time doesn't work well when you are going to be
> using those values several times, each with a different precision.

This won't be a problem if you recreate the values
from strings each time. You're going to have to be
careful anyway, e.g. if you calculate some constants,
such as degreesToRadians = pi/180, you'll have to
make sure that you recalculate them with the desired
precision before rerunning the algorithm.

> Remember, the design documents for the spec state a general principle:
> the digits of a decimal value are *not* significands, rather they are
> exact and all arithmetic on them is exact with the *result* being subject
> to optional rounding.

I don't see how this is relevant, because digits in
a character string are not "digits of a decimal value"
according to what we are meaning by "decimal value"
(i.e. an instance of Decimal). In other words, this
principle only applies *after* we have constructed a
Decimal instance.

-- 
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,	   | A citizen of NewZealandCorp, a	  |
Christchurch, New Zealand	   | wholly-owned subsidiary of USA Inc.  |
greg.ewing at canterbury.ac.nz	   +--------------------------------------+

From greg.ewing at canterbury.ac.nz  Mon May 23 10:17:41 2005
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 23 May 2005 20:17:41 +1200
Subject: [Python-Dev] Adventures with Decimal
References: <000701c55f4e$9866f820$7d27a044@oemcomputer>
Message-ID: <429191A5.5040002@canterbury.ac.nz>

Raymond Hettinger wrote:
> Did you see Mike Cowlishaw's posting where he described why he took our
> current position (wysiwig input) in the spec, in Java's BigDecimal, and
> in Rexx's numeric model?

Yes, it appears that you have channeled him correctly
on that point, and Tim hasn't. :-)

But I also found it interesting that, while the spec
requires the existence of a context for each operation,
it apparently *doesn't* mandate that it must be kept
in a global variable, which is the part that makes me
uncomfortable.

Was there any debate about this choice when the Decimal
module was being designed? It seems to go against
EIBTI, and even against Mr. Cowlishaw's own desire
for WYSIWYG, because WYG depends not only on what
you can see, but on a piece of hidden state as well.

Greg


From ncoghlan at gmail.com  Mon May 23 11:21:44 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 23 May 2005 19:21:44 +1000
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <003201c55ddf$c7e83020$871dc797@oemcomputer>
References: <003201c55ddf$c7e83020$871dc797@oemcomputer>
Message-ID: <4291A0A8.2080409@gmail.com>

Raymond Hettinger wrote:
>>Py> decimal.Decimal("a", context)
>>Decimal("NaN")
>>
>>I'm tempted to suggest deprecating the feature, and say if you want
>>invalid
>>strings to produce NaN, use the create_decimal() method of Context
>>objects.
> 
> 
> The standard does require a NaN to be produced.

In that case, I'd prefer to see the behaviour of the Decimal constructor 
(InvalidOperation exception, or NaN result) always governed by the current context.

If you want to use a different context (either to limit the precision, or to 
alter the way malformed strings are handled), you invoke creation via that 
context, not via the standard constructor.

> Unless something is shown to be wrong with the current implementation, I
> don't think we should be in a hurry to make a post-release change.

The fact that the BDFL (and others, me included) were at least temporarily 
confused by the ability to pass a context in to the constructor suggests there 
is an interface problem here.

The thing that appears to be confusing is that you *can* pass a context in to 
the Decimal constructor, but that context is then almost completely ignored. It 
gives me TOOWTDI concerns,  even though passing the context to the constructor 
does, in fact, differ slightly from using the create_decimal() method (the 
former does not apply the precision, as Guido discovered).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From mcherm at mcherm.com  Mon May 23 14:10:45 2005
From: mcherm at mcherm.com (Michael Chermside)
Date: Mon, 23 May 2005 05:10:45 -0700
Subject: [Python-Dev] Adventures with Decimal
Message-ID: <20050523051045.m525wr7pg65pss0o@login.werra.lunarpages.com>

I'd like to respond to a few people, I'll start with Greg Ewing:

Greg writes:
> I don't see how it
> helps significantly to have just the very first
> step -- turning the input into numbers -- be
> exempt from this behaviour. If anything, people
> are going to be even more confused. "But it
> can obviously cope with 1.1000000000000000001,
> so why does it give the wrong answer when I add
> something to it?"

As I see it, there is a meaningful distinction between constructing
Decimal instances and performing arithmetic with them. I even think
this distinction is easy to explain to users, even beginners. See,
it's all about the program "doing what you tell it to".

If you type in this:
    x = decimal.Decimal("1.100000000000000000000000000003")
as a literal in your program, then you clearly intended for that
last decimal place to mean something. By contrast, if you were to
try passing a float to the Decimal constructor, it would raise an
exception expressly to protect users from "accidentally" entering
something slightly off from what they meant.

On the other hand, in Python, if you type this:
    z = x + y
then what it does is completely dependent on the types of x and y.
In the case of Decimal objects, it performs a "perfect" arithmetic
operation then rounds to the current precision.

The simple explanation for users is "Context affects *operations*,
but not *instances*." This explains the behavior of operations, of
constructors, and also explains the fact that changing precision
doesn't affect the precision of existing instances. And it's only
6 words long.

> But I also found it interesting that, while the spec
> requires the existence of a context for each operation,
> it apparently *doesn't* mandate that it must be kept
> in a global variable, which is the part that makes me
> uncomfortable.
>
> Was there any debate about this choice when the Decimal
> module was being designed?

It shouldn't make you uncomfortable. Storing something in a global
variable is a BAD idea... it is just begging for threads to mess
each other up. The decimal module avoided this by storing a SEPARATE
context for each thread, so different threads won't interfere with
each other. And there *is* a means for easy access to the context
objects... decimal.getcontext().

Yes, it was debated, and the debate led to changing from a global
variable to the existing arrangement.
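The per-thread behaviour is easy to see (the worker function and
results dict here are just for illustration):

```python
import decimal
import threading

results = {}

def worker():
    # This changes the context of *this* thread only.
    decimal.getcontext().prec = 3
    results['rounded'] = +decimal.Decimal("1.23456")

t = threading.Thread(target=worker)
t.start()
t.join()
# results['rounded'] == Decimal("1.23"), but in the main thread the
# default precision still applies: +Decimal("1.23456") == Decimal("1.23456")
```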

------
As long as I'm writing, let me echo Nick Coghlan's point:
> The fact that the BDFL (and others, me included) were at least temporarily
> confused by the ability to pass a context in to the constructor suggests there
> is an interface problem here.
>
> The thing that appears to be confusing is that you *can* pass a context in to
> the Decimal constructor, but that context is then almost completely ignored.

Yeah... I agree. If you provide a Context, it should be used. I favor changing
the behavior of the constructor as follows:

     def Decimal(data, context=None):
         result = Existing_Version_Of_Decimal(data)
         if context is not None:
             result = context.plus(result)
         return result

In other words, make FULL use of the context in the constructor if a context
is provided, but make NO use of the thread context when no context is
provided.
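Note that the "context fully applied at creation" behaviour already
exists in the module as Context.create_decimal(), so the proposal
essentially brings the constructor into line with it whenever a
context is passed:

```python
from decimal import Context, Decimal, ROUND_HALF_UP

ctx = Context(prec=4, rounding=ROUND_HALF_UP)
applied = ctx.create_decimal("1.23456789")   # precision applied: 1.235
plain = Decimal("1.23456789")                # all digits preserved
```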

------
One final point... Thanks to Mike Cowlishaw for chiming in with a detailed
and well-considered explanation of his thoughts on the matter.

-- Michael Chermside


From mcherm at mcherm.com  Mon May 23 14:34:39 2005
From: mcherm at mcherm.com (Michael Chermside)
Date: Mon, 23 May 2005 05:34:39 -0700
Subject: [Python-Dev] PEP 344: Explicit vs. Implicit Chaining
Message-ID: <20050523053439.2aee56vw7vkkk0kc@login.werra.lunarpages.com>

James Knight writes:
> I still don't see why people think the python interpreter should be
> automatically providing __context__. To me it seems like it'll just
> clutter things up for no good reason. If you really want the other
> exception, you can access it via the local variable in the frame
> where it was first caught.

No you can't, because you didn't know the second exception was
going to happen! I write something like this:

    db_connection = get_db_connection()
    try:
        do_some_stuff(db_connection)
    except DatabaseException, err:
        log_the_problem(err)
        cleanup(db_connection)

If something goes wrong inside of do_some_stuff, I enter the
exception handler. But then if an error occurs within
log_the_problem() or cleanup(), then I lose the original exception.
It's just GONE. I didn't expect log_the_problem() or cleanup() to
fail, but sometimes things DO fail.

An example of this happens to me in Java (which has the same
problem).  I have code like this:

    db_connection = get_db_connection()
    try:
        do_some_stuff(db_connection)
    finally:
        db_connection.close()

For instance, when I want to do
unit testing, I create a mock database connection that raises
an exception if you don't use it as the test expects. So I get
exceptions like this all the time:

    Error: did not expect call to "close()"

Of course, what REALLY happened was that we tried to update a row
that didn't exist, which created an exception:

    Error: tried to update row with key "100", but it does not exist.

But then it entered the finally clause, and tried to close the
connection. That wasn't expected either, and the new exception
replaces the old one... and we lose information about what REALLY
caused the problem.

In Java, I had to fix this by making my mock objects very smart.
They have to keep track of whether any problem has occurred during
this test (in any of the cooperating mock objects) and if so, then
they have to re-report the original problem whenever something new
goes wrong. This is the only way I've found to work around the
problem in Java. Wouldn't it be nice if Python could do better?
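The failure mode can be sketched in Python, using the __context__
attribute that PEP 344 proposes (so this only runs on an interpreter
that implements the chaining):

```python
def update_row():
    raise LookupError("tried to update row 100, but it does not exist")

def close_connection():
    raise RuntimeError("did not expect call to close()")

try:
    try:
        update_row()
    finally:
        close_connection()       # this exception replaces the LookupError
except RuntimeError as err:
    # Without chaining the LookupError is simply gone; with PEP 344's
    # __context__ it remains reachable from the replacing exception.
    original = err.__context__
```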

-- Michael Chermside

From pje at telecommunity.com  Mon May 23 15:06:17 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Mon, 23 May 2005 09:06:17 -0400
Subject: [Python-Dev] __trace__? (was Re: PEP 344: Explicit vs. Implicit
 Chaining
In-Reply-To: <F93DFDC3-9113-4461-A8CA-FFF0EC95E631@fuhm.net>
References: <5.1.1.6.0.20050520183332.0349eec0@mail.telecommunity.com>
	<Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
	<Pine.LNX.4.58.0505200330240.4932@server1.LFW.org>
	<5.1.1.6.0.20050520183332.0349eec0@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050523084843.01d89470@mail.telecommunity.com>

At 10:33 AM 5/21/2005 -0400, James Y Knight wrote:
>On May 20, 2005, at 6:37 PM, Phillip J. Eby wrote:
> > This only helps if you can get to a debugger.  What if you're
> > reading your web server's error log?
>
>Then you're in trouble anyways because you need the contents of some
>local to figure out what's going on, also.

Actually, this reminds me of something...  I've often found that tracebacks 
listing the source code are less than informative for developers using a 
library.  I've been thinking about creating a traceback formatter that 
would instead display more useful trace information, but not the 
super-verbose information dumped by cgitb, nor the cryptic and wasteful 
__traceback_info__ of Zope.

Specifically, I was thinking I would have statements like this:

     __trace__ = "Computing the value of %(attrName)s"

embedded in library code.  The traceback formatter would check each frame 
for a local named __trace__, and if present, use it as a format to display 
the frame's locals.  This information would replace only the source code 
line, so you'd still get line and file information in the traceback, but 
you'd see a summary of what that code was currently doing.  (If trying to 
format the trace information produces an error, the formatter should fall 
back to displaying the source line, and perhaps emit some information about 
the broken __trace__ -- maybe just display the original __trace__ string.)

As long as we're proposing traceback formatting enhancements, I'd like to 
suggest this one.  A sufficiently smart compiler+runtime could also 
probably optimize away __trace__ assignments, replacing them with a table 
similar to co_lnotab, but even without such a compiler, a __trace__ 
assignment is just a LOAD_CONST and STORE_FAST; not much overhead at all.

Anyway, judicious use of __trace__ in library code (including the standard 
library) would make tracebacks much more comprehensible.  You can think of 
them as docstrings for errors.  :)
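A rough sketch of such a formatter (using the __traceback__ attribute
from PEP 344; the names here are illustrative, not a worked-out API):

```python
def format_trace(exc):
    """Format a traceback, preferring each frame's __trace__ message."""
    lines = []
    tb = exc.__traceback__
    while tb is not None:
        frame = tb.tb_frame
        template = frame.f_locals.get('__trace__')
        if template is not None:
            try:
                detail = template % frame.f_locals
            except Exception:
                detail = template         # broken __trace__: show it raw
        else:
            detail = '(no __trace__)'     # real formatter: show source line
        lines.append("%s:%d %s" % (frame.f_code.co_filename,
                                   tb.tb_lineno, detail))
        tb = tb.tb_next
    return lines
```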

Interestingly, you could perhaps implement context exceptions in terms of 
__trace__, e.g.:

     try:
         doSomething()
     except Exception, v:
         tb = v.__traceback__
         __trace__ = "Handling exception:\n%(v)s\n%(tb)s"
         # etc.

So, you'd get the formatting of the context exception embedded in the 
traceback of the error in the handler.


From aahz at pythoncraft.com  Mon May 23 15:18:21 2005
From: aahz at pythoncraft.com (Aahz)
Date: Mon, 23 May 2005 06:18:21 -0700
Subject: [Python-Dev] Adventures with Decimal
In-Reply-To: <429191A5.5040002@canterbury.ac.nz>
References: <000701c55f4e$9866f820$7d27a044@oemcomputer>
	<429191A5.5040002@canterbury.ac.nz>
Message-ID: <20050523131820.GA12079@panix.com>

On Mon, May 23, 2005, Greg Ewing wrote:
>
> But I also found it interesting that, while the spec requires the
> existence of a context for each operation, it apparently *doesn't*
> mandate that it must be kept in a global variable, which is the part
> that makes me uncomfortable.
>
> Was there any debate about this choice when the Decimal module was
> being designed?

Absolutely.  First of all, as Michael Chermside pointed out, it's
actually thread-local.  But even without that, we were still prepared to
release Decimal with global context.  Look at Java: you have to specify
the context manually with every operation.  It was a critical design
criterion for Python that this be legal::

    >>> x = Decimal('1.2')
    >>> y = Decimal('1.4')
    >>> x*y
    Decimal("1.68")

IOW, constructing Decimal instances might be a bit painful, but *using*
them would be utterly simple.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"The only problem with Microsoft is they just have no taste." --Steve Jobs

From python-dev at zesty.ca  Tue May 24 08:22:06 2005
From: python-dev at zesty.ca (Ka-Ping Yee)
Date: Tue, 24 May 2005 01:22:06 -0500 (CDT)
Subject: [Python-Dev] AST manipulation and source code generation
Message-ID: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>

Would there be any interest in extending the compiler package with tools
for AST transformations and for emitting Python source code from ASTs?

I was experimenting with possible translations for exception chaining
and wanted to run some automated tests, so i started playing around
with the compiler package to do source-to-source transformations.
Then i started working on a way to do template-based substitution of
ASTs and a way to spit source code back out, and i'm wondering if
that might be good for experimenting with future Python features.

(If there's already stuff out there for doing this, let me know --
i don't intend to duplicate existing work.)


-- ?!ng

From jhylton at gmail.com  Tue May 24 15:56:15 2005
From: jhylton at gmail.com (Jeremy Hylton)
Date: Tue, 24 May 2005 09:56:15 -0400
Subject: [Python-Dev] AST manipulation and source code generation
In-Reply-To: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>
References: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>
Message-ID: <e8bf7a53050524065613ae1192@mail.gmail.com>

On 5/24/05, Ka-Ping Yee <python-dev at zesty.ca> wrote:
> Would there be any interest in extending the compiler package with tools
> for AST transformations and for emitting Python source code from ASTs?

Sure.  Eventually, we'll have to figure out how to unify the compiler
package AST and the ast-branch AST, but don't let that delay you now.

> I was experimenting with possible translations for exception chaining
> and wanted to run some automated tests, so i started playing around
> with the compiler package to do source-to-source transformations.
> Then i started working on a way to do template-based substitution of
> ASTs and a way to spit source code back out, and i'm wondering if
> that might be good for experimenting with future Python features.
> 
> (If there's already stuff out there for doing this, let me know --
> i don't intend to duplicate existing work.)

I don't know of any existing work, but it certainly sounds useful.

Jeremy

From bac at OCF.Berkeley.EDU  Wed May 25 01:11:34 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Tue, 24 May 2005 16:11:34 -0700
Subject: [Python-Dev] Localized Type Inference of Atomic Types in Python
Message-ID: <4293B4A6.8030002@ocf.berkeley.edu>

My thesis, "Localized Type Inference of Atomic Types in Python", was
successfully defended today for my MS in Computer Science at the California
Polytechnic State University, San Luis Obispo.  With that stamp of approval I
am releasing it to the world.  You can grab a copy at
http://www.drifty.org/thesis.pdf .

For those of you who attended my talk at PyCon 2005 this is the thesis that
stemmed from the presented data.

As of this exact moment I am not planning to release the source code mainly
because it's a mess, I am not in the mood to pull the patches together, and the
last thing I want happening is people finding mistakes in the code.  =)  But if
enough people request the source I will take the time to generate a tar.bz2
file of patches against the 2.3.4 source release and put them up somewhere.

Below is the abstract culled directly from the thesis itself.

-Brett C.

---------------------------------
ABSTRACT

Types serve multiple purposes in programming.  One such purpose is in providing
information to allow for improved performance.  Unfortunately, specifying the
types of all variables in a program does not always fit within the design of a
programming language.

Python is a language where specifying types does not fit within the language
design.  An open source, dynamic programming language, Python does not support
type specifications of variables.  This limits the opportunities in Python for
performance optimizations based on type information  compared to languages that
do allow or require the specification of types.

Type inference is a way to derive the needed type information for optimizations
based on types without requiring type specifications in the source code of a
program.  By inferring the types of variables based on flow control and other
hints in a program, the type information can be derived and used in a
constructive manner.

This thesis is an exploration of implementing a type inference algorithm for
Python without changing the semantics of the language.  It also explores the
benefit of adding type annotations to method calls in order to garner more type
information.
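The localized inference of atomic types that the abstract describes can be
sketched, very roughly and only for names bound directly to literals, with
the modern stdlib ast module (an assumption for illustration: the thesis
itself worked on the CPython 2.3 compiler, not on this module):

```python
import ast

def infer_atomic_types(source):
    """Rough sketch: record the type of each name that is assigned a
    literal constant -- the 'atomic types' case only, with no flow
    analysis.  Not the thesis's actual algorithm."""
    types = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Assign) and isinstance(node.value, ast.Constant):
            for target in node.targets:
                if isinstance(target, ast.Name):
                    types[target.id] = type(node.value.value).__name__
    return types

print(infer_atomic_types("x = 1\ny = 'hi'\nz = 2.5"))
# {'x': 'int', 'y': 'str', 'z': 'float'}
```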

From tjreedy at udel.edu  Wed May 25 01:24:04 2005
From: tjreedy at udel.edu (Terry Reedy)
Date: Tue, 24 May 2005 19:24:04 -0400
Subject: [Python-Dev] Localized Type Inference of Atomic Types in Python
References: <4293B4A6.8030002@ocf.berkeley.edu>
Message-ID: <d70cvc$fhs$1@sea.gmane.org>


"Brett C." <bac at OCF.Berkeley.EDU> wrote in message 
news:4293B4A6.8030002 at ocf.berkeley.edu...
> My thesis, "Localized Type Inference of Atomic Types in Python", was
> successfully defended today for my MS in Computer Science at the 
> California
> Polytechnic State University, San Luis Obispo.

Woo hoo.  Congratulations.

Terry J. Reedy 




From facundobatista at gmail.com  Wed May 25 03:56:19 2005
From: facundobatista at gmail.com (Facundo Batista)
Date: Tue, 24 May 2005 22:56:19 -0300
Subject: [Python-Dev] Decimal FAQ
In-Reply-To: <016f01c55f3d$077cf280$7d27a044@oemcomputer>
References: <016f01c55f3d$077cf280$7d27a044@oemcomputer>
Message-ID: <e04bdf31050524185611d0382a@mail.gmail.com>

On 5/22/05, Raymond Hettinger <raymond.hettinger at verizon.net> wrote:

> Some of the private email I've received indicates a need for a decimal
> FAQ that would shorten the module's learning curve.

Nice FAQ, but where should we put it? It's kinda for advanced Decimal users...


> A.  Some users prefer to abbreviate the constructor to just a single
> letter:
> 
> >>> D = decimal.Decimal
> >>> D('1.23') + D('3.45')
> Decimal("4.68")

I'd add something like "However you'll note that this kind of use is
in examples, not in everyday code".


> >>> TWOPLACES = Decimal(10) ** -2

I always write it as Decimal("0.01"); I think it's clearer.
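For reference, the two spellings denote the same value, and either works as
a quantize() argument for rounding to two places (Python 3 syntax shown):

```python
from decimal import Decimal

TWOPLACES = Decimal(10) ** -2        # same value as Decimal("0.01")
assert TWOPLACES == Decimal("0.01")

# Either spelling can serve as the exponent pattern for quantize():
print(Decimal("3.14159").quantize(TWOPLACES))        # 3.14
print(Decimal("3.14159").quantize(Decimal("0.01")))  # 3.14
```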


> A more versatile approach is to create a separate class with Decimal
> objects as attributes (working in a "has-a" capacity rather than an
> "is-a" capacity) and then delegating the arithmetic to the Decimal
> class.

Currency will use the "has-a" method, ;)

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

From arigo at tunes.org  Wed May 25 12:08:09 2005
From: arigo at tunes.org (Armin Rigo)
Date: Wed, 25 May 2005 12:08:09 +0200
Subject: [Python-Dev] Localized Type Inference of Atomic Types in Python
In-Reply-To: <4293B4A6.8030002@ocf.berkeley.edu>
References: <4293B4A6.8030002@ocf.berkeley.edu>
Message-ID: <20050525100809.GA7257@code1.codespeak.net>

Hi Brett,

On Tue, May 24, 2005 at 04:11:34PM -0700, Brett C. wrote:
> My thesis, "Localized Type Inference of Atomic Types in Python", was
> successfully defended today for my MS in Computer Science at the California
> Polytechnic State University, San Luis Obispo.

Congratulations !

Nitpickingly... thanks for the references to Psyco, though I should add
that Psyco has been supporting more than just ints and strings since
shortly after my first e-mail to python-dev about it (in 2001 I think)
:-)  it actually knows more or less about all common built-in types.


A bientot,

Armin

From andy at andygross.org  Wed May 25 19:31:34 2005
From: andy at andygross.org (Andy Gross)
Date: Wed, 25 May 2005 13:31:34 -0400
Subject: [Python-Dev] AST manipulation and source code generation
In-Reply-To: <e8bf7a53050524065613ae1192@mail.gmail.com>
References: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>
	<e8bf7a53050524065613ae1192@mail.gmail.com>
Message-ID: <6F7E9530-B8EF-47DA-9A4E-703EA34781D1@andygross.org>


I wrote something like this (called pyunparse) a little while ago.   
It's not the cleanest code in the world, but it worked for my  
original use case (debugging Logix, which uses python ASTs as an IR):

http://www.pycs.net/users/0000445/stories/7.html

Cheers,

/arg




On May 24, 2005, at 9:56 AM, Jeremy Hylton wrote:


> On 5/24/05, Ka-Ping Yee <python-dev at zesty.ca> wrote:
>
>
>> Would there be any interest in extending the compiler package with  
>> tools
>> for AST transformations and for emitting Python source code from  
>> ASTs?
>>
>>
>
> Sure.  Eventually, we'll have to figure out how to unify the compiler
> package AST and the ast-branch AST, but don't let that delay you now.
>
>
>
>> I was experimenting with possible translations for exception chaining
>> and wanted to run some automated tests, so i started playing around
>> with the compiler package to do source-to-source transformations.
>> Then i started working on a way to do template-based substitution of
>> ASTs and a way to spit source code back out, and i'm wondering if
>> that might be good for experimenting with future Python features.
>>
>> (If there's already stuff out there for doing this, let me know --
>> i don't intend to duplicate existing work.)
>>
>>
>
> I don't know of any existing work, but it certainly sounds useful.
>
> Jeremy
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: http://mail.python.org/mailman/options/python-dev/andy% 
> 40andygross.org
>
>



From chad at zetaweb.com  Wed May 25 20:31:34 2005
From: chad at zetaweb.com (Chad Whitacre)
Date: Wed, 25 May 2005 14:31:34 -0400
Subject: [Python-Dev] AST manipulation and source code generation
In-Reply-To: <6F7E9530-B8EF-47DA-9A4E-703EA34781D1@andygross.org>
References: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>	<e8bf7a53050524065613ae1192@mail.gmail.com>
	<6F7E9530-B8EF-47DA-9A4E-703EA34781D1@andygross.org>
Message-ID: <d72gac$n5l$1@sea.gmane.org>

Ka-Ping,

FWIW, I've also got an implementation, which is based on the parser 
module rather than the compiler module. Much simpler, imo, but 
whitespace isn't preserved (could be perhaps?).

Anyway, take it or leave it. Links follow.


chad

-----

Subversion repository:
   http://svn.zetadev.com/repos/public/ASTutils/

The relevant method is 'ast2text' in ASTutils.py:
   http://svn.zetadev.com/repos/public/ASTutils/tags/0.2.0/ASTutils.py

API documentation for this method:
http://www.zetadev.com/software/ASTutils/latest/api/public/ASTutils.ASTutils-class.html#ast2text

API documentation root:
   http://www.zetadev.com/software/ASTutils/latest/api/


From bac at OCF.Berkeley.EDU  Wed May 25 21:40:48 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Wed, 25 May 2005 12:40:48 -0700
Subject: [Python-Dev] Localized Type Inference of Atomic Types in Python
In-Reply-To: <20050525100809.GA7257@code1.codespeak.net>
References: <4293B4A6.8030002@ocf.berkeley.edu>
	<20050525100809.GA7257@code1.codespeak.net>
Message-ID: <4294D4C0.7090405@ocf.berkeley.edu>

Armin Rigo wrote:
> Hi Brett,
> 
> On Tue, May 24, 2005 at 04:11:34PM -0700, Brett C. wrote:
> 
>>My thesis, "Localized Type Inference of Atomic Types in Python", was
>>successfully defended today for my MS in Computer Science at the California
>>Polytechnic State University, San Luis Obispo.
> 
> 
> Congratulations !
> 
> Nitpickingly... thanks for the references to Psyco, though I should add
> that Psyco has been supporting more than just ints and strings since
> shortly after my first e-mail to python-dev about it (in 2001 I think)
> :-)  it actually knows more or less about all common built-in types.
> 

Crap, sorry!  That is what I get for taking someone's word instead of digging
into it myself.

-Brett

From kbk at shore.net  Wed May 25 07:02:19 2005
From: kbk at shore.net (Kurt B. Kaiser)
Date: Wed, 25 May 2005 01:02:19 -0400 (EDT)
Subject: [Python-Dev] Weekly Python Patch/Bug Summary
Message-ID: <200505250502.j4P52JB1001530@bayview.thirdcreek.com>

Patch / Bug Summary
___________________

Patches :  342 open ( +3) /  2839 closed ( +1) /  3181 total ( +4)
Bugs    :  936 open ( -2) /  4974 closed (+12) /  5910 total (+10)
RFE     :  189 open ( +2) /   159 closed ( +2) /   348 total ( +4)

New / Reopened Patches
______________________

optparse documentation bug fixes  (2005-05-18)
       http://python.org/sf/1204347  opened by  Barry A. Warsaw

Bugfix for signal-handler on x64 Platform  (2005-05-20)
       http://python.org/sf/1205436  opened by  André Fritzsche

updates for the compiler package  (2005-05-21)
       http://python.org/sf/1206077  opened by  Stelios

An URL for UnicodeData File Format 3.2 has changed.  (2005-05-24)
       http://python.org/sf/1207985  opened by  Darek Suchojad

Patches Closed
______________

workaround deprecated ostat structure in <sys/stat.h>  (2005-05-17)
       http://python.org/sf/1203329  closed by  loewis

New / Reopened Bugs
___________________

Documentation error?  (2005-05-18)
       http://python.org/sf/1204734  opened by  John Eikenberry

urllib has spurious print statement  (2005-05-20)
       http://python.org/sf/1205544  opened by  Stuart Wray

Compile fails on Darwin8 with --with-cxx=g++  (2005-05-20)
       http://python.org/sf/1205568  opened by  Robert M. Zigweid

wrong location for math lib with --prefix  (2005-05-20)
       http://python.org/sf/1205736  opened by  Thomas Richter

IDLE 1.0.5 (Python 2.3.5) crashes under Windows  (2005-05-21)
       http://python.org/sf/1206232  opened by  Torsten Bronger

weakref cannot handle bound methods (in contrast to docu)  (2005-05-22)
       http://python.org/sf/1206537  opened by  Raik Gruenberg

class property fset not working  (2005-05-24)
       http://python.org/sf/1207379  opened by  Master_Jaf

installer ignores changed installation directory  (2005-05-24)
       http://python.org/sf/1207466  opened by  Blubb Fallo

Issue in grammar  (2005-05-24)
       http://python.org/sf/1207501  opened by  venkat manian

Issue in grammar  (2005-05-24)
CLOSED http://python.org/sf/1207509  opened by  venkat manian

Bugs Closed
___________

Problem with recursion in dict (crash with core dump)  (2005-05-13)
       http://python.org/sf/1201456  closed by  vys

Windows msi installer fails on virtual drives  (2005-05-12)
       http://python.org/sf/1200287  closed by  loewis

urllib2 authentication redirection error(?)  (2004-11-21)
       http://python.org/sf/1070735  closed by  allanbwilson

No documentation for urllib2 exception classes  (2004-04-29)
       http://python.org/sf/944407  closed by  fresh

file("foo", "wU") is silently accepted  (2004-06-05)
       http://python.org/sf/967182  closed by  montanaro

UnboundLocalError in cgitb.py  (2003-12-16)
       http://python.org/sf/861340  closed by  montanaro

Python 2.4.1 Installer ended prematurely  (2005-05-11)
       http://python.org/sf/1199947  closed by  loewis

xml.dom.minidom.Node.removeChild() doesn't remove  (2005-03-06)
       http://python.org/sf/1157901  closed by  mkempka

csv writer bug on windows  (2004-04-29)
       http://python.org/sf/944890  closed by  montanaro

Importing anydbm generates exception if _bsddb unavailable  (2003-05-02)
       http://python.org/sf/731501  closed by  montanaro

Explicit interp reference during build fails  (2003-07-08)
       http://python.org/sf/768068  closed by  montanaro

Issue in grammar  (2005-05-24)
       http://python.org/sf/1207509  closed by  mwh

New / Reopened RFE
__________________

Let shift operators take any integer value  (2005-05-19)
       http://python.org/sf/1205239  opened by  David Albert Torpey

Right Click Context Menu  (2005-05-24)
       http://python.org/sf/1207589  opened by  Mike Foord

Clipboard Cleared on Close  (2005-05-24)
       http://python.org/sf/1207592  opened by  Mike Foord

Bottom Scroll Bar  (2005-05-24)
       http://python.org/sf/1207613  opened by  Mike Foord

RFE Closed
__________

enhancing os.chown functionality  (2005-05-12)
       http://python.org/sf/1200804  closed by  loewis

"replace" function should accept lists.  (2005-04-17)
       http://python.org/sf/1184678  closed by  loewis


From Sylvain.Thenault at logilab.fr  Thu May 26 09:58:10 2005
From: Sylvain.Thenault at logilab.fr (Sylvain =?iso-8859-1?Q?Th=E9nault?=)
Date: Thu, 26 May 2005 09:58:10 +0200
Subject: [Python-Dev] AST manipulation and source code generation
In-Reply-To: <6F7E9530-B8EF-47DA-9A4E-703EA34781D1@andygross.org>
References: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>
	<e8bf7a53050524065613ae1192@mail.gmail.com>
	<6F7E9530-B8EF-47DA-9A4E-703EA34781D1@andygross.org>
Message-ID: <20050526075810.GB3936@logilab.fr>

> > On 5/24/05, Ka-Ping Yee <python-dev at zesty.ca> wrote:
> >
> >
> >> Would there be any interest in extending the compiler package with  
> >> tools
> >> for AST transformations and for emitting Python source code from  
> >> ASTs?

The astng package from Logilab's common library [1] extends compiler AST
nodes with a bunch of methods, including an as_string method on each node
that regenerates Python source code from an AST (the other methods mainly
ease navigation in the tree or extract higher-level information from it).
Currently it's implemented as a method on the nodes rather than via a
visitor pattern or the like, but that change would be easy to make.

[1] http://www.logilab.org/projects/common/

-- 
Sylvain Thénault                               LOGILAB, Paris (France).

http://www.logilab.com   http://www.logilab.fr  http://www.logilab.org


From chad at zetaweb.com  Thu May 26 14:46:50 2005
From: chad at zetaweb.com (Chad Whitacre)
Date: Thu, 26 May 2005 08:46:50 -0400
Subject: [Python-Dev] AST manipulation and source code generation
In-Reply-To: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>
References: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>
Message-ID: <4295C53A.50402@zetaweb.com>

> Would there be any interest in extending the compiler package with tools
> for AST transformations and for emitting Python source code from ASTs?

Heh, so I guess the answer is "yes."

BTW, how does the concept of AST transformations relate to the concept 
of (Lisp) macros? Am I right to think that they are similar?



chad


From jhylton at gmail.com  Thu May 26 15:07:30 2005
From: jhylton at gmail.com (Jeremy Hylton)
Date: Thu, 26 May 2005 09:07:30 -0400
Subject: [Python-Dev] AST manipulation and source code generation
In-Reply-To: <4295C53A.50402@zetaweb.com>
References: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>
	<4295C53A.50402@zetaweb.com>
Message-ID: <e8bf7a530505260607f00d30f@mail.gmail.com>

On 5/26/05, Chad Whitacre <chad at zetaweb.com> wrote:
> > Would there be any interest in extending the compiler package with tools
> > for AST transformations and for emitting Python source code from ASTs?
> 
> Heh, so I guess the answer is "yes."
> 
> BTW, how does the concept of AST transformations relate to the concept
> of (Lisp) macros? Am I right to think that they are similar?

I think they are similar, but two key differences are:

 - An AST transformation can transform existing syntax but doesn't allow you
   to create new syntax.

 - An AST transformation has to be explicitly invoked.  A macro is part of
   the language proper and has defined semantics for how and when macros
   are evaluated.

Jeremy
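A concrete illustration of the explicitly-invoked transformation Jeremy
describes -- note this uses today's stdlib ast module rather than the 2005
compiler package the thread discusses, so treat it as a sketch of the idea:

```python
import ast

class DoubleNumbers(ast.NodeTransformer):
    """Toy transformation: rewrite every numeric literal n into 2*n.
    (A made-up example; the 2005 'compiler' package is gone, so this
    uses the modern stdlib 'ast' module instead.)"""
    def visit_Constant(self, node):
        if isinstance(node.value, (int, float)) and not isinstance(node.value, bool):
            return ast.copy_location(ast.Constant(node.value * 2), node)
        return node

tree = DoubleNumbers().visit(ast.parse("result = 10 + 11"))
ast.fix_missing_locations(tree)

namespace = {}
exec(compile(tree, "<ast>", "exec"), namespace)
print(namespace["result"])    # 42

# ...and source can be regenerated from the transformed tree,
# which is the other half of what the thread asks for:
print(ast.unparse(tree))      # result = 20 + 22
```

Unlike a Lisp macro, nothing runs this automatically: you must parse,
transform, and recompile by hand, which is exactly the distinction drawn above.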

From chad at zetaweb.com  Thu May 26 15:11:49 2005
From: chad at zetaweb.com (Chad Whitacre)
Date: Thu, 26 May 2005 09:11:49 -0400
Subject: [Python-Dev] AST manipulation and source code generation
In-Reply-To: <e8bf7a530505260607f00d30f@mail.gmail.com>
References: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>	<4295C53A.50402@zetaweb.com>
	<e8bf7a530505260607f00d30f@mail.gmail.com>
Message-ID: <d74hus$3i6$1@sea.gmane.org>

Thanks Jeremy. Also wandered off-list w/ Ka-Ping; posting here for 
posterity.


chad

-----

chad: BTW, how does the concept of AST transformations relate to the 
concept of (Lisp) macros? Am I right to think that they are similar?

?!ng: Absolutely.  In terms of mechanism, they're basically the same;
the main difference is that in Lisp, the transformations are a part
of the core language definition.

?!ng: Well, i should refine that a bit to say that the Lisp macro system
is a little more specific.  Whereas AST transformations in Python
are open-ended (you could generate any result you want), the key
interesting property of Lisp macros is that they are constrained
to be "safe", in the sense that the bindings of variable names are
always preserved.

chad: Hmmm ... I don't follow python-dev closely but hasn't there been 
resistance to macros in Python? Are we saying macros may be a good idea 
after all?

?!ng: resistance -> Yes.
?!ng: good idea -> Not really.  AST transformations are useful for 
experimenting with the language, but i don't think there is much 
enthusiasm for making these transformations a normal part of the way 
most programs are written.


From jhylton at gmail.com  Thu May 26 15:28:37 2005
From: jhylton at gmail.com (Jeremy Hylton)
Date: Thu, 26 May 2005 09:28:37 -0400
Subject: [Python-Dev] AST manipulation and source code generation
In-Reply-To: <4295CAAF.6040502@zetaweb.com>
References: <Pine.LNX.4.58.0505240117240.4932@server1.LFW.org>
	<4295C53A.50402@zetaweb.com>
	<e8bf7a530505260607f00d30f@mail.gmail.com>
	<4295CAAF.6040502@zetaweb.com>
Message-ID: <e8bf7a5305052606287981283d@mail.gmail.com>

On 5/26/05, Chad Whitacre <chad at zetaweb.com> wrote:
> chad: Hmmm ... I don't follow python-dev closely but hasn't there been
> resistance to macros in Python? Are we saying macros may be a good idea
> after all?
> 
> ?!ng: resistance -> Yes.
> ?!ng: good idea -> Not really.  AST transformations are useful for
> experimenting with the language, but i don't think there is much
> enthusiasm for making these transformations a normal part of the way
> most programs are written.

Right.  We fear macros and prefer to suffer getattr hooks,
descriptors, and decorators <wink>.

Jeremy

From bac at OCF.Berkeley.EDU  Sat May 28 01:18:02 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Fri, 27 May 2005 16:18:02 -0700
Subject: [Python-Dev] Request for dev permissions
In-Reply-To: <d6ejg7$gic$1@sea.gmane.org>
References: <d6ejg7$gic$1@sea.gmane.org>
Message-ID: <4297AAAA.1050000@ocf.berkeley.edu>

Reinhold Birkenfeld wrote:
> Hello,
> 
> would anybody mind if I was given permissions on the tracker and CVS, for fixing small
> things like bug #1202475. I feel that I can help you others out a bit with this and
> I promise I won't change the interpreter to accept braces...
> 

Since no direct follow-up seems to have been given publicly, I wanted to
double-check to see if any movement had been made on this.  I personally don't
know Reinhold and thus cannot vouch for him, but I wanted to make sure he got a
straight yes/no answer out of someone instead of languishing in silence.

-Brett

From bac at OCF.Berkeley.EDU  Sat May 28 01:28:55 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Fri, 27 May 2005 16:28:55 -0700
Subject: [Python-Dev] PEP 342/343 status?
Message-ID: <4297AD37.5000808@ocf.berkeley.edu>

Been rather quiet around here lately, so I just wanted to do a quick check to
see what the status is on PEPs 342 and 343.  I noticed Nick's PEP is still not
up.  Probably too busy with that fix for genexps in the AST branch, huh, Nick?  =)

Guido, you need something hashed out from us at this point, or do you have this
all settled in your head and are just waiting for time to lock it down in the
PEP?  Or should the PEPs be changed from draft to final and an implementation
(which I am *not* volunteering for  =) is now needed?

-Brett

From ncoghlan at gmail.com  Sat May 28 08:18:24 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 28 May 2005 16:18:24 +1000
Subject: [Python-Dev] PEP 342/343 status?
In-Reply-To: <4297AD37.5000808@ocf.berkeley.edu>
References: <4297AD37.5000808@ocf.berkeley.edu>
Message-ID: <42980D30.1000805@gmail.com>

Brett C. wrote:
> Been rather quite around here lately so I just wanted to do a quick check to
> see what the status is on PEPs 342 and 343.  I noticed Nick's PEP is still not
> up.  Probably too busy with that fix for genexps in the AST branch, huh, Nick?  =)

Something like that. . . still, I finally got around to fixing the formatting in 
the text file and sending it back to David :)

> Guido, you need something hashed out from us at this point, or do you have this
> all settled in your head and are just waiting for time to lock it down in the
> PEP?  Or should the PEPs be changed from draft to final and an implementation
> (which I am *not* volunteering for  =) is now needed?

There's a generator finalisation PEP to come from Raymond, and Guido has made 
some comments on that front that probably need to be folded into PEP 343.

And implementations will probably have to wait for the AST branch anyway. . .

Off-to-fix-the-generator-symbol-table-entries'ly,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From ncoghlan at gmail.com  Sat May 28 14:22:16 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 28 May 2005 22:22:16 +1000
Subject: [Python-Dev] AST branch patches (was Re:  PEP 342/343 status?)
In-Reply-To: <42980D30.1000805@gmail.com>
References: <4297AD37.5000808@ocf.berkeley.edu> <42980D30.1000805@gmail.com>
Message-ID: <42986278.9030001@gmail.com>

Nick Coghlan wrote:
> Brett C. wrote:
> 
>>I noticed Nick's PEP is still not
>>up.  Probably too busy with that fix for genexps in the AST branch, huh, Nick?  =)
> 
> Something like that. . . still, I finally got around to fixing the formatting in 
> the text file and sending it back to David :)

Add to that AST patches for the genexp scoping, lambda nested args and 
creation of distinct code objects for lambdas defined on different 
lines. I guess I finally got over the Python overdose resulting from 
trying to keep up with the PEP 340 discussion :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From bac at OCF.Berkeley.EDU  Sat May 28 20:52:17 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Sat, 28 May 2005 11:52:17 -0700
Subject: [Python-Dev] AST branch patches (was Re:  PEP 342/343 status?)
In-Reply-To: <42986278.9030001@gmail.com>
References: <4297AD37.5000808@ocf.berkeley.edu> <42980D30.1000805@gmail.com>
	<42986278.9030001@gmail.com>
Message-ID: <4298BDE1.5070603@ocf.berkeley.edu>

Nick Coghlan wrote:
> Nick Coghlan wrote:
> 
>>Brett C. wrote:
>>
>>
>>>I noticed Nick's PEP is still not
>>>up.  Probably too busy with that fix for genexps in the AST branch, huh, Nick?  =)
>>
>>Something like that. . . still, I finally got around to fixing the formatting in 
>>the text file and sending it back to David :)
> 
> 
> Add to that AST patches for the genexp scoping, lambda nested args and 
> creation of distinct code objects for lambdas defined on different 
> lines. I guess I finally got over the Python overdose resulting from 
> trying to keep up with the PEP 340 discussion :)
> 

Wow!  Thanks, Nick!  Now I am the slacker.  =)

To give you and everyone else a general timeline on the AST stuff, I am coming
up on the last week of school for me.  After this upcoming week I have a week
to pack, graduation weekend, and then I start my internship in the Bay Area.
My hope is to put in at least an hour into Python every other night after work
(although, knowing me, that hour will morph into three hours at least  =).  I
am planning to split this time between the AST branch, reworking the dev docs
at python.org/dev, and finally writing the Python 3000 exceptions reorg PEP.  I
am hoping to make some decent headway on the branch this summer before I
leave for UBC.

For those of you who want to keep track of the progress of the AST branch,
there is a running tracker item, bug #1191458
(http://www.python.org/sf/1191458), that lists the currently failing tests.
Once that bug report is closed, the AST branch will be semantically complete.
I do, though, want to go through and clean up the code to match PEP 7 specs
before it gets merged into the mainline.

-Brett

From skip at pobox.com  Sat May 28 22:30:31 2005
From: skip at pobox.com (Skip Montanaro)
Date: Sat, 28 May 2005 15:30:31 -0500
Subject: [Python-Dev] [Python-checkins] python/dist/src/Lib/test
	test_site.py, 1.6, 1.7
In-Reply-To: <E1DbgHp-0002So-SD@sc8-pr-cvs1.sourceforge.net>
References: <E1DbgHp-0002So-SD@sc8-pr-cvs1.sourceforge.net>
Message-ID: <17048.54503.120092.75031@montanaro.dyndns.org>


    mwh> Fix test_site to not call open('...', 'wU'), as that now raises an
    mwh> error.

    mwh> Is anyone running the test suite regularly at the moment?

Whoops.  I obviously failed to run it after applying that change.  My
apologies.

Skip

From python at rcn.com  Sun May 29 00:59:47 2005
From: python at rcn.com (Raymond Hettinger)
Date: Sat, 28 May 2005 18:59:47 -0400
Subject: [Python-Dev] Request for dev permissions
In-Reply-To: <d6ejg7$gic$1@sea.gmane.org>
Message-ID: <001c01c563d8$f2adf660$8d2dc797@oemcomputer>

[Reinhold Birkenfeld]
> would anybody mind if I was given permissions on the tracker and CVS,
for
> fixing small
> things like bug #1202475. I feel that I can help you others out a bit
with
> this and
> I promise I won't change the interpreter to accept braces...

Let's start out with CVS tracker permissions.
When you have a patch that is really to apply,
upload it to the tracker and assign to me.



Raymond Hettinger

From python at rcn.com  Sun May 29 01:03:43 2005
From: python at rcn.com (Raymond Hettinger)
Date: Sat, 28 May 2005 19:03:43 -0400
Subject: [Python-Dev] Request for dev permissions
In-Reply-To: <001c01c563d8$f2adf660$8d2dc797@oemcomputer>
Message-ID: <001d01c563d9$7f3a2fe0$8d2dc797@oemcomputer>

> Let's start out with CVS tracker permissions.
> When you have a patch that is really to apply,
> upload it to the tracker and assign to me.

really --> ready


From ncoghlan at iinet.net.au  Sun May 29 01:56:01 2005
From: ncoghlan at iinet.net.au (Nick Coghlan)
Date: Sun, 29 May 2005 09:56:01 +1000
Subject: [Python-Dev] PEP 346: User defined statements (formerly known as
	PEP 3XX)
Message-ID: <42990511.6050502@iinet.net.au>

My erstwhile PEP now has a real number (PEP 346), and a home on 
python.org [1].

At my request, it was withdrawn immediately after submission - the 
parts I think are important, Guido is taking on board for PEP 343 [2].

Cheers,
Nick.

[1] http://www.python.org/peps/pep-0346.html

[2] http://mail.python.org/pipermail/python-dev/2005-May/053885.html

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From noamraph at gmail.com  Sun May 29 02:23:59 2005
From: noamraph at gmail.com (Noam Raphael)
Date: Sun, 29 May 2005 03:23:59 +0300
Subject: [Python-Dev] Split MIME headers into multiple lines near a space
Message-ID: <b348a0850505281723415f1a6f@mail.gmail.com>

Hello,

I recently used Python to automatically send messages to my gmail
account. I was surprised to find out that some of the words in the
subjects of messages were split by a space character which came from
nowhere.

It turns out that the international (Hebrew) subject was split into
multiple lines by the email package, sometimes in the middle of words.
Gmail treats these line breaks as spaces, so words gets cut into two.
I've checked, and there are email clients which ignore the line
breaks, so the subject looks ok.

I added four lines to the _binsplit function of email.Header, so that
if there is a space character in the string, it will be split
there. This fixes the problem, and subjects look fine again. These
four lines (plus a comment which I wrote) are:

    # Try to find a place in splittable[:i] which is near a space,
    # and split there, so that clients which interpret the line break
    # as a separator won't insert a space in the middle of a word.
    if splittable[i:i+1] != ' ':
        spacepos = splittable.rfind(' ', 0, i)
        if spacepos != -1:
            i = spacepos + 1

These lines should be added before the last three lines of _binsplit.

Do you think it's ok? Could this be added to email.Header?

(Should I send this as a patch? It's just that the patch list was full
of IDLE patches, and this change is really small, so I thought that it
would be easier to post it here. Please tell me if I was wrong.)

Thank you,
Noam Raphael
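A standalone sketch of the patch's idea -- prefer breaking at a space so a
header line break never lands mid-word.  (split_near_space is a hypothetical
helper for illustration; it is not the actual email.Header._binsplit code.)

```python
def split_near_space(encoded, maxlen):
    """Split 'encoded' at or before 'maxlen', backing up to just after
    the last space when the cut would land mid-word.  Hypothetical
    helper illustrating the patch, not email.Header internals."""
    if len(encoded) <= maxlen:
        return encoded, ""
    i = maxlen
    # Mirror the patch: if the split point isn't already at a space,
    # move it back to just after the last space before it.
    if encoded[i:i+1] != " ":
        spacepos = encoded.rfind(" ", 0, i)
        if spacepos != -1:
            i = spacepos + 1
    return encoded[:i], encoded[i:]

print(split_near_space("hello wonderful world", 10))
# ('hello ', 'wonderful world')
```

A client that treats the fold as a space then reassembles "hello wonderful
world" intact instead of inventing a break inside "wonderful".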

From nidoizo at yahoo.com  Sun May 29 13:00:24 2005
From: nidoizo at yahoo.com (Nicolas Fleury)
Date: Sun, 29 May 2005 07:00:24 -0400
Subject: [Python-Dev] Adding content to exception messages
In-Reply-To: <428C5502.80503@gmail.com>
References: <20050517055812.42184.qmail@web50908.mail.yahoo.com>
	<428C5502.80503@gmail.com>
Message-ID: <4299A0C8.1030206@yahoo.com>

Nick Coghlan wrote:
> With PEP 344, this could simply be:
> 
>    try:
>        parser.parseFile(file)
>    exeption Exception, exception:
>        raise type(exception)("Error at line %s in file %s" % (x,y))
> 
> Introspectively,
> Nick.
> 

It doesn't work (unless I misunderstand you).  For example, the
exceptions.UnicodeTranslateError constructor needs 4 arguments, not 1.
That's why I think a parallel mechanism is necessary to add additional
information while keeping the same exception type.

I probably didn't explain myself well, either.  Consider this very hacky
implementation that works with the status quo:

class ExceptionStr:
     def __init__(self, content):
         self.content = content
         self.infos = []
     def addinfo(self, info):
         self.infos.insert(0, info)
     def __call__(self):
         return '\n'.join(self.infos + [self.content])

import sys
def reraise(exception, info=None):
     strFunc = getattr(exception, "__str__", None)
     if not isinstance(strFunc, ExceptionStr):
         strFunc = ExceptionStr(str(exception))
         exception.__str__ = strFunc
     if info:
         strFunc.addinfo(info)
     raise exception, None, sys.exc_info()[-1]

The following code:

try:
     try:
         raise Exception("hello")
     except Exception, exception:
         reraise(exception, "doing x")
except Exception, exception:
     reraise(exception, "doing y")

would produce something like:

Traceback (most recent call last):
   File "somefile.py", line 7, in ?
     reraise(exception, "doing y")
   File "somefile.py", line 5, in ?
     reraise(exception, "doing x")
   File "somefile.py", line 3, in ?
     raise Exception("hello")
Exception: doing y
doing x
hello

(Note that having the lines 5 and 7 in the traceback is not wanted)

What I propose is to instead have something like:

try:
     try:
         raise Exception("hello")
     except Exception, exception:
         # have some way to reraise a copy of "exception"
         # or the same exception with additional info "doing x"
         # For example:
         exception.addinfo("doing x")
         raise exception from exception.__context__
except Exception, exception:
     # Idem with "doing y"

And have as output:

Traceback (most recent call last):
   File "somefile.py", line 3, in ?
     raise Exception("hello")
Additional info:
   doing y
   doing x
Exception: hello

Regards,
Nicolas


From mwh at python.net  Sun May 29 20:04:53 2005
From: mwh at python.net (Michael Hudson)
Date: Sun, 29 May 2005 19:04:53 +0100
Subject: [Python-Dev] [Python-checkins] python/dist/src/Lib/test
 test_site.py, 1.6, 1.7
In-Reply-To: <17048.54503.120092.75031@montanaro.dyndns.org> (Skip
	Montanaro's message of "Sat, 28 May 2005 15:30:31 -0500")
References: <E1DbgHp-0002So-SD@sc8-pr-cvs1.sourceforge.net>
	<17048.54503.120092.75031@montanaro.dyndns.org>
Message-ID: <2mr7fpx60q.fsf@starship.python.net>

Skip Montanaro <skip at pobox.com> writes:

>     mwh> Fix test_site to not call open('...', 'wU'), as that now raises an
>     mwh> error.
>
>     mwh> Is anyone running the test suite regularly at the moment?
>
> Whoops.  I obviously failed to run it after applying that change.  My
> apologies.

Well, it wasn't just that you didn't run the test suite; obviously no one
else did for about a week, either!

Cheers,
mwh

-- 
  ... the U.S. Department of Transportation today disclosed that its
  agents have recently cleared airport security checkpoints with an 
  M1 tank, a beluga whale, and a fully active South American volcano.
             -- http://www.satirewire.com/news/march02/screeners.shtml

From ncoghlan at gmail.com  Mon May 30 00:02:37 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 30 May 2005 08:02:37 +1000
Subject: [Python-Dev] Split MIME headers into multiple lines near a space
In-Reply-To: <b348a0850505281723415f1a6f@mail.gmail.com>
References: <b348a0850505281723415f1a6f@mail.gmail.com>
Message-ID: <429A3BFD.2040805@gmail.com>

Noam Raphael wrote:
> Do you think it's ok? Could this be added to email.Header?

Noam posted a patch to SF (#1210680), and I assigned it to Barry to 
have a look at. Noam's suggestion seems reasonable to me, but I'm not 
sure what the performance implications are.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From facundobatista at gmail.com  Mon May 30 22:28:13 2005
From: facundobatista at gmail.com (Facundo Batista)
Date: Mon, 30 May 2005 17:28:13 -0300
Subject: [Python-Dev] Deprecating old bugs, now from 2.2.2
Message-ID: <e04bdf3105053013285b2e1d4e@mail.gmail.com>

Going on with the old-bug checking, here are the results for 2.2.2 (and
one from 2.2.1). When I finish, this will be put into an informational PEP.

When I verified each bug, I filled in the following fields:

- Summary: the same subject as in SF
- Group: the bug's group at verification time.
- Bug #: the bug number
- Verified: the date when I checked the bug.
- Action: what I did then.

If the bug survived the verification, the next two fields are
applicable (if not, I put a dash; the idea is to keep this info easily
parseable):

- Final: the action taken by someone who eliminated the bug from
  that category (closed, moved to Py2.4, etc.).
- By: the person who did the final action.

So, here's the info...


Summary:  Python Interpreter shell is crashed
Group:    2.2.2
Bug #:    1100673
Verified: 15-Jan-2005
Action:   Deprecation alerted. Works ok for me.
Final:    Closed. Rejected.
By:       mwh

Summary:  popen3 under threads reports different stderr results
Group:    2.2.2
Bug #:    856706
Verified: 15-Jan-2005
Action:   Closed as duplicate of #853411
Final:    -
By:       -

Summary:  CGIHTTPServer cannot manage cgi in sub directories
Group:    2.2.2
Bug #:    778804
Verified: 15-Jan-2005
Action:   Closing as Fixed (was fixed already).
Final:    -
By:       -

Summary:  popen does not like filenames with spaces
Group:    2.2.2
Bug #:    774546
Verified: 15-Jan-2005
Action:   Deprecation alerted. Should be closed with "Won't fix",
especially now that we have the subprocess module.
Final:    Closed. Out of date.
By:       rhettinger

Summary:  popen4 doesn't close filedescriptors when in Threads
Group:    2.2.2
Bug #:    768649
Verified: 15-Jan-2005
Action:   Deprecation alerted. Works ok for me.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  Sudden death with SIGSEGV in RtlEnterCriticalSection
Group:    2.2.2
Bug #:    763190
Verified: 15-Jan-2005
Action:   Deprecation alerted. Don't have the context to try it.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  can't CNTRL-C when running os.system in a thread
Group:    2.2.2
Bug #:    756940
Verified: 15-Jan-2005
Action:   Deprecation alerted. There's another bug similar to this
(#756924) and a patch for it (which has been accepted), but I can't try
it right now.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  Calling socket.recv() with a large number breaks
Group:    2.2.2
Bug #:    756104
Verified: 15-Jan-2005
Action:   Changed to Py2.4. The bug is still there (but in the documentation).
Final:    -
By:       -

Summary:  ftplib.retrbinary fails when called from retrlines callback
Group:    2.2.2
Bug #:    751758
Verified: 15-Jan-2005
Action:   Changed to Py2.4. The bug is still there.
Final:    -
By:       -

Summary:  CGIHTTPServer does not handle scripts in sub-dirs
Group:    2.2.2
Bug #:    737202
Verified: 15-Jan-2005
Action:   Deprecation alerted. There's a patch, but the actual code has
something very similar to it; I don't have the context to check whether
it's fixed.
Final:    Changed to Py2.4. The bug is still there.
By:       facundobatista

Summary:  socketmodule.c: inet_pton() expects 4-byte packed_addr
Group:    2.2.2
Bug #:    730222
Verified: 15-Jan-2005
Action:   Deprecation alerted. Don't have the context to see if it's
already solved (there's an unapplied patch).
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  mmap's resize method resizes the file in win32 but not unix
Group:    2.2.2
Bug #:    728515
Verified: 15-Jan-2005
Action:   Deprecation alerted. Don't know enough about it to check it
or reproduce it.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  Core Dumps : Python2.2.2
Group:    2.2.2
Bug #:    727241
Verified: 15-Jan-2005
Action:   Deprecation alerted. Not sure if it's a Python or IRIX bug.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  inspect, class instances and __getattr__
Group:    2.2.2
Bug #:    718532
Verified: 15-Jan-2005
Action:   Deprecation alerted. Not sure if there's a bug.
Final:    Changed to Py2.4.
By:       facundobatista

Summary:  "build_ext" "libraries" subcommand not s
Group:    2.2.2
Bug #:    716634
Verified: 15-Jan-2005
Action:   Deprecation alerted. Not sure if there's a bug.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  SEEK_{SET,CUR,END} missing in 2.2.2
Group:    2.2.2
Bug #:    711830
Verified: 15-Jan-2005
Action:   Deprecation alerted. Not sure if there's a bug.
Final:    Closed. Fixed.
By:       loewis

Summary:  codecs.open and iterators
Group:    2.2.2
Bug #:    706595
Verified: 15-Jan-2005
Action:   Deprecation alerted. Couldn't reproduce the problem; don't
know enough about the subject to be sure whether it's a bug or not.
Final:    Closed. Out of date.
By:       doerwalter

Summary:  test_atexit fails in directories with spaces
Group:    2.2.2
Bug #:    705792
Verified: 15-Jan-2005
Action:   Closing as Fixed (was fixed already).
Final:    -
By:       -

Summary:  --without-cxx flag of configure isn't documented.
Group:    2.2.2
Bug #:    702147
Verified: 15-Jan-2005
Action:   Deprecation alerted. The reporter never sent the proposed
patch for the docs; it's really not clear whether there's something to be fixed.
Final:    Closed. Out of date.
By:       bcannon

Summary:  Thread running (os.system or popen#)
Group:    2.2.2
Bug #:    701836
Verified: 15-Jan-2005
Action:   Deprecation alerted. Works ok for me.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  Canvas origin is off-canvas in create_<item>(). Worka
Group:    2.2.2
Bug #:    700650
Verified: 15-Jan-2005
Action:   Deprecation alerted. Missing attached file; I depend on that
to verify the bug.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  Canvas Widget origin is off-screen
Group:    2.2.2
Bug #:    699816
Verified: 15-Jan-2005
Action:   Deprecation alerted. Missing attached file; I depend on that
to verify the bug.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  imaplib: parsing INTERNALDATE
Group:    2.2.2
Bug #:    698706
Verified: 15-Jan-2005
Action:   Changed to Py2.4. The bug is still there.
Final:    -
By:       -

Summary:  _iscommand() in webbrowser module
Group:    2.2.2
Bug #:    687747
Verified: 15-Jan-2005
Action:   Deprecation alerted. Don't have the context to try it.
Final:    Changed to Py2.3.
By:       facundobatista

Summary:  Profilier hooked into SystemExit
Group:    2.2.2
Bug #:    687297
Verified: 11-Jan-2005
Action:   Deprecation alerted. Don't have the context to try it.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  Incorrect permissions set in lib-dynload.
Group:    2.2.2
Bug #:    680379
Verified: 11-Jan-2005
Action:   Deprecation alerted. Don't have the context to try it.
Final:    Closed: Fixed.
By:       facundobatista

Summary:  String formatting operation Unicode problem.
Group:    2.2.2
Bug #:    676346
Verified: 11-Jan-2005
Action:   Deprecation alerted. Don't know enough about it to be sure
if it's really a bug.
Final:    Closed: Fixed.
By:       facundobatista

Summary:  Py_Main() does not perform to spec
Group:    2.2.2
Bug #:    672035
Verified: 11-Jan-2005
Action:   Closing it; it was put in "Fixed" previously but was still "Open".
Final:    -
By:       -

Summary:  os.popen+() can take string list and bypass shell.
Group:    2.2.2
Bug #:    666700
Verified: 11-Jan-2005
Action:   Deprecation alerted. Not sure if there's a bug.
Final:    Closed: Fixed.
By:       facundobatista

Summary:  doctest and exception messages
Group:    2.2.2
Bug #:    654783
Verified: 11-Jan-2005
Action:   Deprecation alerted. Not sure if there's a bug.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  for lin in file: file.tell() tells wrong
Group:    2.2.2
Bug #:    645594
Verified: 11-Jan-2005
Action:   Closed because of #1036626 rationale.
Final:    -
By:       -

Summary:  Misuse of /usr/local/ in setup.py
Group:    2.2.2
Bug #:    640553
Verified: 08-Jan-2005
Action:   Deprecation alerted. From the bug discussion it is not
clear to me whether the problem actually existed.
Final:    Closed: Won't fix.
By:       facundobatista

Summary:  crash (SEGV) in Py_EndInterpreter()
Group:    2.2.2
Bug #:    639611
Verified: 08-Jan-2005
Action:   Deprecation alerted. Don't know enough about the subject to
be able to reproduce the issue.
Final:    Closed: Works for me.
By:       jlgijsbers

Summary:  Tkinter: BitmapImage vanishes if not stored in non-local var
Group:    2.2.2
Bug #:    632323
Verified: 27-Dec-2004
Action:   Deprecation alerted. Don't know what to do with this one.
It's marked as Invalid but still Open, and it's not clear to me whether
it was a bug, is still a bug, or neither.
Final:    Changed to Py2.3.
By:       facundobatista

Summary:  test_signal hangs -- signal broken on OpenBSD?
Group:    2.2.1 candidate
Bug #:    549081
Verified: 26-Dec-2004
Action:   Deprecation alerted. I cannot try it; I don't have that
context. Judging from a comment, maybe it's already fixed.
Final:    Closed: Won't fix.
By:       facundobatista


Regards,

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

From facundobatista at gmail.com  Mon May 30 22:41:20 2005
From: facundobatista at gmail.com (Facundo Batista)
Date: Mon, 30 May 2005 17:41:20 -0300
Subject: [Python-Dev] Old Python version categories in Bug Tracker
Message-ID: <e04bdf3105053013417164dc8f@mail.gmail.com>

People:

As the process of deprecating old bugs evolves, the following
categories have become empty:

    Python 2.1.1
    Python 2.1.2
    Python 2.2.1
    Python 2.2.1 candidate
    Python 2.2.2

The SF interface doesn't allow us to delete old categories, but maybe we
could ask SF support to do it, since nowadays it doesn't make sense to
open a bug in these categories.

As an unfortunate example, somebody opened a bug today in the "Python
2.2" category. :(

What do you think? 

Thanks. 

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

From python at rcn.com  Mon May 30 23:06:52 2005
From: python at rcn.com (Raymond Hettinger)
Date: Mon, 30 May 2005 17:06:52 -0400
Subject: [Python-Dev] Old Python version categories in Bug Tracker
In-Reply-To: <e04bdf3105053013417164dc8f@mail.gmail.com>
Message-ID: <000601c5655b$810efe80$5807a044@oemcomputer>

> As the process of deprecating old bugs evolves, the following
> categories got empty:
> 
>     Python 2.1.1
>     Python 2.1.2
>     Python 2.2.1
>     Python 2.2.1 candidate
>     Python 2.2.2


That's great news.



> The SF interface doesn't allow to delete old categories, but maybe we
> could ask SF support to do it, as nowadays doesn't make sense to open
> a bug in these categories.
> 
> As an unfortunate example, somebody opened a bug today in the "Python
> 2.2" category. :(
> 
> What do you think?

There's no harm in having these surface.  If the category is accurate,
let's use it.  If the bug is out of date, we can mark it as such and
close it.


Raymond

From bac at OCF.Berkeley.EDU  Tue May 31 01:24:42 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Mon, 30 May 2005 16:24:42 -0700
Subject: [Python-Dev] Deprecating old bugs, now from 2.2.2
In-Reply-To: <e04bdf3105053013285b2e1d4e@mail.gmail.com>
References: <e04bdf3105053013285b2e1d4e@mail.gmail.com>
Message-ID: <429BA0BA.9020703@ocf.berkeley.edu>

Facundo Batista wrote:
> Going on with the old-bug checking, here are the results for 2.2.2 (and
> one from 2.2.1). When I finish, this will be put into an informational PEP.
> 

Great work, Facundo!  Now I feel lazy.  =)

-Brett

From facundobatista at gmail.com  Tue May 31 01:46:10 2005
From: facundobatista at gmail.com (Facundo Batista)
Date: Mon, 30 May 2005 20:46:10 -0300
Subject: [Python-Dev] Deprecating old bugs, now from 2.2.2
In-Reply-To: <429BA0BA.9020703@ocf.berkeley.edu>
References: <e04bdf3105053013285b2e1d4e@mail.gmail.com>
	<429BA0BA.9020703@ocf.berkeley.edu>
Message-ID: <e04bdf310505301646276f1b1d@mail.gmail.com>

On 5/30/05, Brett C. <bac at ocf.berkeley.edu> wrote:

> Facundo Batista wrote:
> > Going on with the old-bug checking, here are the results for 2.2.2 (and
> > one from 2.2.1). When I finish, this will be put into an informational PEP.
> >
> 
> Great work, Facundo!  Now I feel lazy.  =)

C'mon! Just a well-spent day of my vacation, ;)

Anyway, thank you very much, and keep doing *your* good work (of which
I'm not nearly capable).

Regards,

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

From fdrake at acm.org  Tue May 31 04:17:48 2005
From: fdrake at acm.org (Fred L. Drake, Jr.)
Date: Mon, 30 May 2005 22:17:48 -0400
Subject: [Python-Dev] Old Python version categories in Bug Tracker
In-Reply-To: <000601c5655b$810efe80$5807a044@oemcomputer>
References: <000601c5655b$810efe80$5807a044@oemcomputer>
Message-ID: <200505302217.48489.fdrake@acm.org>

On Monday 30 May 2005 17:06, Raymond Hettinger wrote:
 > There's no harm in having these surface.  If the category is accurate,
 > let's use it.  If the bug is out of date, we can mark it as such and
 > close it.

While we can't (and shouldn't) delete categories, we can change the text used 
to describe them.  So "Python 2.2.2" can become "Python 2.2.2 
(unmaintained)".  Whether this is desirable or not, I'm not sure.


  -Fred

-- 
Fred L. Drake, Jr.   <fdrake at acm.org>

From jcarlson at uci.edu  Tue May 31 04:46:48 2005
From: jcarlson at uci.edu (Josiah Carlson)
Date: Mon, 30 May 2005 19:46:48 -0700
Subject: [Python-Dev] Old Python version categories in Bug Tracker
In-Reply-To: <200505302217.48489.fdrake@acm.org>
References: <000601c5655b$810efe80$5807a044@oemcomputer>
	<200505302217.48489.fdrake@acm.org>
Message-ID: <20050530193231.9B77.JCARLSON@uci.edu>


"Fred L. Drake, Jr." <fdrake at acm.org> wrote:
> 
> On Monday 30 May 2005 17:06, Raymond Hettinger wrote:
>  > There's no harm in having these surface.  If the category is accurate,
>  > let's use it.  If the bug is out of date, we can mark it as such and
>  > close it.
> 
> While we can't (and shouldn't) delete categories, we can change the text used 
> to describe them.  So "Python 2.2.2" can become "Python 2.2.2 
> (unmaintained)".  Whether this is desirable or not, I'm not sure.

One can also create a new group for "really old stuff", place a comment
that describes the original group (something like "PreviousGroup..."),
then when closing/wont fix/etc., move the object into the new group.

Then we can rename/reuse those 2.1 and 2.2 categories as desired.

If one wants to search for items that used to be in some group, one
can use the search functionality of sourceforge.


Pain in the rear, but it is a solution to the 'problem' of having 2.1
and 2.2 groups.

 - Josiah

P.S. I personally wouldn't do this; I would rename as Fred suggests.


From facundobatista at gmail.com  Tue May 31 04:48:55 2005
From: facundobatista at gmail.com (Facundo Batista)
Date: Mon, 30 May 2005 23:48:55 -0300
Subject: [Python-Dev] Old Python version categories in Bug Tracker
In-Reply-To: <200505302217.48489.fdrake@acm.org>
References: <000601c5655b$810efe80$5807a044@oemcomputer>
	<200505302217.48489.fdrake@acm.org>
Message-ID: <e04bdf3105053019481034d146@mail.gmail.com>

On 5/30/05, Fred L. Drake, Jr. <fdrake at acm.org> wrote:

> While we can't (and shouldn't) delete categories, we can change the text used
> to describe them.  So "Python 2.2.2" can become "Python 2.2.2
> (unmaintained)".  Whether this is desirable or not, I'm not sure.

+1 for this solution.

We (aka "this list" ;) decided to deprecate old bugs (more than one
version in the past, <=2.2.* at this moment) if the original poster or
commenters didn't update them. So I think it's fair enough to inform
people in that way when they want to file a bug against those
versions.

Opinions?

Thanks!

.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/

From fdrake at acm.org  Tue May 31 05:09:35 2005
From: fdrake at acm.org (Fred L. Drake, Jr.)
Date: Mon, 30 May 2005 23:09:35 -0400
Subject: [Python-Dev] Old Python version categories in Bug Tracker
In-Reply-To: <20050530193231.9B77.JCARLSON@uci.edu>
References: <000601c5655b$810efe80$5807a044@oemcomputer>
	<200505302217.48489.fdrake@acm.org>
	<20050530193231.9B77.JCARLSON@uci.edu>
Message-ID: <200505302309.35302.fdrake@acm.org>

On Monday 30 May 2005 22:46, Josiah Carlson wrote:
 > Pain in the rear, but it is a solution to the 'problem' of having 2.1
 > and 2.2 groups.

The issue is really that it's not clear that this is a real problem.  
Unfortunate, yes, but that's it.  Ideally, there'd be a way to say that 
certain categories/groups were marked unusable for new submissions.  But 
that's not necessary in any way.


  -Fred

-- 
Fred L. Drake, Jr.   <fdrake at acm.org>

From bac at OCF.Berkeley.EDU  Tue May 31 05:38:06 2005
From: bac at OCF.Berkeley.EDU (Brett C.)
Date: Mon, 30 May 2005 20:38:06 -0700
Subject: [Python-Dev] Old Python version categories in Bug Tracker
In-Reply-To: <e04bdf3105053019481034d146@mail.gmail.com>
References: <000601c5655b$810efe80$5807a044@oemcomputer>	<200505302217.48489.fdrake@acm.org>
	<e04bdf3105053019481034d146@mail.gmail.com>
Message-ID: <429BDC1E.4050705@ocf.berkeley.edu>

Facundo Batista wrote:
> On 5/30/05, Fred L. Drake, Jr. <fdrake at acm.org> wrote:
> 
> 
>>While we can't (and shouldn't) delete categories, we can change the text used
>>to describe them.  So "Python 2.2.2" can become "Python 2.2.2
>>(unmaintained)".  Whether this is desirable or not, I'm not sure.
> 
> 
> +1 for this solution.
> 

+1 from me as well.

-Brett

From reinhold-birkenfeld-nospam at wolke7.net  Tue May 31 10:28:47 2005
From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld)
Date: Tue, 31 May 2005 10:28:47 +0200
Subject: [Python-Dev] Request for dev permissions
In-Reply-To: <001c01c563d8$f2adf660$8d2dc797@oemcomputer>
References: <d6ejg7$gic$1@sea.gmane.org>
	<001c01c563d8$f2adf660$8d2dc797@oemcomputer>
Message-ID: <d7h6vp$crd$1@sea.gmane.org>

Raymond Hettinger wrote:
> [Reinhold Birkenfeld]
>> would anybody mind if I was given permissions on the tracker and CVS,
> for
>> fixing small
>> things like bug #1202475. I feel that I can help you others out a bit
> with
>> this and
>> I promise I won't change the interpreter to accept braces...
> 
> Let's start out with CVS tracker permissions.
> When you have a patch that is ready to apply,
> upload it to the tracker and assign it to me.

OK, thanks.

Reinhold

-- 
Mail address is perfectly valid!


From ncoghlan at gmail.com  Tue May 31 11:06:57 2005
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 31 May 2005 19:06:57 +1000
Subject: [Python-Dev] Old Python version categories in Bug Tracker
In-Reply-To: <429BDC1E.4050705@ocf.berkeley.edu>
References: <000601c5655b$810efe80$5807a044@oemcomputer>	<200505302217.48489.fdrake@acm.org>	<e04bdf3105053019481034d146@mail.gmail.com>
	<429BDC1E.4050705@ocf.berkeley.edu>
Message-ID: <429C2931.6040404@gmail.com>

Brett C. wrote:
> Facundo Batista wrote:
> 
>>On 5/30/05, Fred L. Drake, Jr. <fdrake at acm.org> wrote:
>>
>>
>>
>>>While we can't (and shouldn't) delete categories, we can change the text used
>>>to describe them.  So "Python 2.2.2" can become "Python 2.2.2
>>>(unmaintained)".  Whether this is desirable or not, I'm not sure.
>>
>>
>>+1 for this solution.
> 
> 
> +1 from me as well.

Sounds good to me, too, as long as a bit of explanatory text is added 
to the 'submit new' page. (e.g. "If no other groups apply, please use 
the group for your major Python version. Bugs reported against 
unmaintained versions of Python will not be fixed unless the bug can 
be reproduced using one of the maintained versions.")

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------
             http://boredomandlaziness.blogspot.com

From raymond.hettinger at verizon.net  Tue May 31 13:53:44 2005
From: raymond.hettinger at verizon.net (Raymond Hettinger)
Date: Tue, 31 May 2005 07:53:44 -0400
Subject: [Python-Dev] Closing old bugs
Message-ID: <000001c565d7$65bebc20$5807a044@oemcomputer>

There should be some greater care exercised in closing old bugs.
Marking them "deprecated" and then erasing them is only a good strategy
if we have no means of reproducing the error or ascertaining what the OP
was talking about. 

For instance, in www.python.org/sf/640553, it was possible for a
reviewer to directly verify whether /usr/local was still being used
in setup.py.  Likewise, www.python.org/sf/728515 should not have been
closed (Martin's post could have been taken as a clue that the bug was
valid and simply waiting for some volunteer to submit a patch).

Old age and a missing OP is not sufficient reason to close a bug.  If
the report is clear and the bug is potentially still valid, it should be
left open.  Efforts to clear old bugs should focus on fixing them or
making a conscious Won't Fix decision (with old age possibly indicating
that there is not a real problem in practice).



Raymond


P.S.  When setting a time deadline for an OP to respond, we should use
some terminology other than "deprecated".  The word doesn't fit well and
can be confused with the unrelated topic of module or feature
deprecation.


From gvanrossum at gmail.com  Tue May 31 20:12:16 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 31 May 2005 11:12:16 -0700
Subject: [Python-Dev] PEP 342/343 status?
In-Reply-To: <42980D30.1000805@gmail.com>
References: <4297AD37.5000808@ocf.berkeley.edu> <42980D30.1000805@gmail.com>
Message-ID: <ca471dc205053111123eeff28b@mail.gmail.com>

[Brett C.]
> > Been rather quiet around here lately so I just wanted to do a quick check to
> > see what the status is on PEPs 342 and 343.  I noticed Nick's PEP is still not
> > up.  Probably too busy with that fix for genexps in the AST branch, huh, Nick?  =)

[Nick Coghlan]
> Something like that. . . still, I finally got around to fixing the formatting in
> the text file and sending it back to David :)
> 
> > Guido, you need something hashed out from us at this point, or do you have this
> > all settled in your head and are just waiting for time to lock it down in the
> > PEP?  Or should the PEPs be changed from draft to final and an implementation
> > (which I am *not* volunteering for  =) is now needed?
> 
> There's a generator finalisation PEP to come from Raymond, and Guido has made
> some comments on that front that probably need to be folded into PEP 343.

Yeah, I'm mostly waiting for Raymond to make the changes to PEP 288. I
guess the flap over Decimal's constructor (nicely concluded by Mike
Cowlishaw -- we should invite him to do a keynote on EuroPython next
year) and the uncertainty of the possibility to call close() from the
destructor may have slowed that down. I wonder if we can make progress
by leaving that final part (automatic invocation of close() upon
destruction) out -- I don't think that anything in the rest of the
spec would have to change if we can't call close() automatically.

If Raymond would rather defer to me, I can give it a shot in a revised
version of PEP 343, at the same time as finishing up some other loose
ends there (e.g. I'd like to switch allegiance to 'with').

> And implementations will probably have to wait for the AST branch anyway. . .

That's fine. But don't underestimate the VM work needed! I looked at
what it would take to add an optional argument to <generator>.next()
so that yield could return a value, and realized that I don't know my
way around ceval.c any more... :-(
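That VM work eventually landed as PEP 342's send() in Python 2.5: yield became an expression, and "<generator>.next() with an argument" was spelled send(). A minimal sketch of yield returning a value:

```python
def accumulator():
    """A coroutine-style generator: each send() delivers a value into
    the suspended yield expression and gets back the running total."""
    total = 0
    while True:
        value = yield total  # send(value) resumes here with `value`
        total += value

acc = accumulator()
next(acc)            # prime the generator up to the first yield
print(acc.send(10))  # -> 10
print(acc.send(5))   # -> 15
```

The priming next() call is needed because a value can only be sent to a generator that is already suspended at a yield.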

If people are interested in sprinting on this at EuroPython I could be
available for some time, but we'll need someone more knowledgeable
about ceval.c to kick-start the work.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Tue May 31 20:41:46 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 31 May 2005 14:41:46 -0400
Subject: [Python-Dev] PEP 342/343 status?
In-Reply-To: <ca471dc205053111123eeff28b@mail.gmail.com>
References: <42980D30.1000805@gmail.com> <4297AD37.5000808@ocf.berkeley.edu>
	<42980D30.1000805@gmail.com>
Message-ID: <5.1.1.6.0.20050531142934.02471690@mail.telecommunity.com>

At 11:12 AM 5/31/2005 -0700, Guido van Rossum wrote:
>uncertainty of the possibility to call close() from the
>destructor may have slowed that down.

If you're talking about the bit about __del__ not working when hanging off 
a cycle, my apologies for creating that confusion; I misunderstood Tim's post.

If you're talking about the circular reference scenario involving 
exceptions, then I think I've already given a tentative explanation why 
such a cycle will not include the generator-iterator itself (only its 
frame) as long as the generator-iterator clears its frame's f_back when the 
frame is not in use.  It would probably be a good idea to verify this, but 
I think the basic idea is sound.

The only scenario in which a generator-iterator would be uncollectable is 
if you took some action outside the iterator to pass it into itself, or if 
you stored the iterator in a global variable.  However, even in the latter 
case, the phase of Python shutdown that clears module contents would break 
the cycle and cause the generator to finalize, albeit a bit late and 
perhaps in a broken way.  Still, storage of generator-iterators in module 
globals is a very unlikely scenario, since it's more common to loop over 
such iterators than it is to store them in any kind of variable.
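In today's CPython (3.4+, after PEP 442) this scenario can be checked directly: a suspended generator caught in a reference cycle is still collected by the cycle collector, and collection runs the generator's finalizer (its close()), so try/finally blocks do fire. A small sketch:

```python
import gc

finalized = []

def worker():
    try:
        yield "running"
    finally:
        finalized.append("closed")  # runs when the generator is finalized

class Holder:
    pass

h = Holder()
h.gen = worker()
next(h.gen)      # suspend the generator at the yield
h.loop = h       # create a reference cycle keeping the generator alive
del h            # the cycle is now unreachable

gc.collect()     # the collector finds the cycle and finalizes the generator
print(finalized) # -> ['closed']
```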


From gvanrossum at gmail.com  Tue May 31 21:03:14 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 31 May 2005 12:03:14 -0700
Subject: [Python-Dev] PEP 342/343 status?
In-Reply-To: <5.1.1.6.0.20050531142934.02471690@mail.telecommunity.com>
References: <4297AD37.5000808@ocf.berkeley.edu> <42980D30.1000805@gmail.com>
	<ca471dc205053111123eeff28b@mail.gmail.com>
	<5.1.1.6.0.20050531142934.02471690@mail.telecommunity.com>
Message-ID: <ca471dc2050531120339b09d17@mail.gmail.com>

[Guido]
> >uncertainty of the possibility to call close() from the
> >destructor may have slowed that down.

[Phillip]
> If you're talking about the bit about __del__ not working when hanging off
> a cycle, my apologies for creating that confusion; I misunderstood Tim's post.

No, I'm not talking about that.

> If you're talking about the circular reference scenario involving
> exceptions, then I think I've already given a tentative explanation why
> such a cycle will not include the generator-iterator itself (only its
> frame) as long as the generator-iterator clears its frame's f_back when the
> frame is not in use.  It would probably be a good idea to verify this, but
> I think the basic idea is sound.

Yes, the generator does clear its f_back when it's suspended.

> The only scenario in which a generator-iterator would be uncollectable is
> if you took some action outside the iterator to pass it into itself, or if
> you stored the iterator in a global variable.  However, even in the latter
> case, the phase of Python shutdown that clears module contents would break
> the cycle and cause the generator to finalize, albeit a bit late and
> perhaps in a broken way.  Still, storage of generator-iterators in module
> globals is a very unlikely scenario, since it's more common to loop over
> such iterators than it is to store them in any kind of variable.

Sure. But I still have some reservations, since cycles can pop up in
the strangest of places (especially when tracebacks are involved --
tracebacks have been causing problems due to cycles and keeping
variables alive almost since Python's inception). I posted about this
a while ago, but don't recall seeing a response that took my fear
away.

Unfortunately, discussing this is about 1000x easier when you have a
shared drawing surface available -- I cannot even think about this
myself without making drawings of object references...

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From pje at telecommunity.com  Tue May 31 21:27:38 2005
From: pje at telecommunity.com (Phillip J. Eby)
Date: Tue, 31 May 2005 15:27:38 -0400
Subject: [Python-Dev] PEP 342/343 status?
In-Reply-To: <ca471dc2050531120339b09d17@mail.gmail.com>
References: <5.1.1.6.0.20050531142934.02471690@mail.telecommunity.com>
	<4297AD37.5000808@ocf.berkeley.edu> <42980D30.1000805@gmail.com>
	<ca471dc205053111123eeff28b@mail.gmail.com>
	<5.1.1.6.0.20050531142934.02471690@mail.telecommunity.com>
Message-ID: <5.1.1.6.0.20050531151230.021f4fb0@mail.telecommunity.com>

At 12:03 PM 5/31/2005 -0700, Guido van Rossum wrote:
>Yes, the generator does clear its f_back when it's suspended.

I realize this won't fix all your worries; I just want to rule out this one 
*particular* form of cycle as a possibility; i.e., to show that mere 
reference to a generator-iterator in a frame does not create a cycle:


     callerframe ---------> traceback2
        |     ^                |  |
        |     |                |  |
        |     +----------------+  |
        v                         v
     geniter -> genframe -> traceback1
                    ^          |
                    |          |
                    +----------+

As you can see, the geniter itself doesn't have a reference to its calling 
frame, so as soon as the highest-level traceback object is released, the 
cycle collector should release the upper cycle, allowing the geniter to 
complete, and releasing the lower cycle.

The scenario assumes, by the way, that the traceback object referenced by a 
frame includes a pointer to that same frame, which I'm not sure is the 
case.  I was under the impression that the current frame is only added to 
the traceback when the frame is exited, in which case the two cycles shown 
above wouldn't even exist; each traceback  would be pointing to the *next* 
frame down, and there would be no cycles at all.  It seems to me that this 
would almost have to be the design, since tracebacks existed before cyclic 
GC did.
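Phillip's premise, that a suspended generator frame keeps no link back to its caller, can be observed directly in modern CPython:

```python
def gen():
    # While the generator body is executing, its frame's f_back points
    # at the caller; when it suspends at the yield, CPython unlinks it.
    yield "suspended"

g = gen()
next(g)  # advance to the yield and suspend

# The suspended frame holds no reference back to the calling frame,
# so a frame merely referencing `g` does not by itself create a cycle.
print(g.gi_frame.f_back)  # -> None
```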


>Sure. But I still have some reservations, since cycles can pop up in
>the strangest of places (especially when tracebacks are involved --
>tracebacks have been causing problems due to cycles and keeping
>variables alive almost since Python's inception). I posted about this
>a while ago, but don't recall seeing a response that took my fear
>away.

Well, I can't prove that it's not possible to create such cycles, 
certainly.  But maybe we should finally get rid of the deprecated 
sys.exc_type/value/traceback variables, so that they can't root any cycles?


From python at rcn.com  Tue May 31 21:23:23 2005
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 31 May 2005 15:23:23 -0400
Subject: [Python-Dev] PEP 342/343 status?
In-Reply-To: <ca471dc205053111123eeff28b@mail.gmail.com>
Message-ID: <000a01c56616$36d0e2a0$0339c797@oemcomputer>

> If Raymond would rather defer to me, I can give it a shot in a revised
> version of PEP 343, 

Thanks, that would be great.  The decimal conversation used up a lot of
my available cycles.



Raymond

From gvanrossum at gmail.com  Tue May 31 22:30:30 2005
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Tue, 31 May 2005 13:30:30 -0700
Subject: [Python-Dev] PEP 342/343 status?
In-Reply-To: <5.1.1.6.0.20050531151230.021f4fb0@mail.telecommunity.com>
References: <4297AD37.5000808@ocf.berkeley.edu> <42980D30.1000805@gmail.com>
	<ca471dc205053111123eeff28b@mail.gmail.com>
	<5.1.1.6.0.20050531142934.02471690@mail.telecommunity.com>
	<ca471dc2050531120339b09d17@mail.gmail.com>
	<5.1.1.6.0.20050531151230.021f4fb0@mail.telecommunity.com>
Message-ID: <ca471dc205053113303f4285d2@mail.gmail.com>

[Guido]
> >Yes, the generator does clear its f_back when it's suspended.

[Phillip]
> I realize this won't fix all your worries; I just want to rule out
> this one *particular* form of cycle as a possibility; i.e., to show
> that mere reference to a generator-iterator in a frame does not
> create a cycle:
> 
> 
>      callerframe ---------> traceback2
>         |     ^                |  |
>         |     |                |  |
>         |     +----------------+  |
>         v                         v
>      geniter -> genframe -> traceback1
>                     ^          |
>                     |          |
>                     +----------+
> 
> As you can see, the geniter itself doesn't have a reference to its
> calling frame, so as soon as the highest-level traceback object is
> released, the cycle collector should release the upper cycle,
> allowing the geniter to complete, and releasing the lower cycle.
> 
> The scenario assumes, by the way, that the traceback object
> referenced by a frame includes a pointer to that same frame, which
> I'm not sure is the case.  I was under the impression that the
> current frame is only added to the traceback when the frame is
> exited, in which case the two cycles shown above wouldn't even
> exist; each traceback would be pointing to the *next* frame down,
> and there would be no cycles at all.  It seems to me that this would
> almost have to be the design, since tracebacks existed before cyclic
> GC did.

Alas, your assumption is valid; this would indeed cause a cycle, much
to the despair of early Python programmers.  There used to be a whole
body of literature about the best way to avoid this (never save a
traceback, or if you do, clear it when you're done with it before
exiting the frame).  When you raise and immediately catch an
exception, there is a single traceback object that references the
current frame (the frame where it was raised *and* caught).  So if you
store sys.exc_info() in a local, you have a cycle already:

  try:
      raise Exception
  except:
      x = sys.exc_info()[2]   # save the traceback

Now we have the following cycle (with slightly more detail than your
diagram):

             tb_frame
    frame <------------- traceback
      |                       ^
      |                       |
      v         'x'           |
   f_locals ------------------+
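A sketch that makes this cycle observable via the gc module (the helper names
are illustrative, and this assumes CPython's cycle collector; the weakref to a
sentinel in the frame's locals shows when the cycle is actually reclaimed):

```python
import gc
import sys
import weakref

class Marker(object):
    """Lives only in the function's locals; weakly referenced from outside."""

def make_cycle():
    marker = Marker()
    try:
        raise Exception
    except Exception:
        x = sys.exc_info()[2]  # traceback -> frame -> local 'x' -> traceback
    return weakref.ref(marker)

gc.disable()  # ensure only an explicit collect() can break the cycle
ref = make_cycle()
assert ref() is not None   # the tb<->frame cycle keeps the frame's locals alive
gc.collect()
assert ref() is None       # the cycle collector reclaimed the whole cycle
gc.enable()
```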

BTW, note the repercussions this has for Ping's PEP 344 -- because
the Exception instance references the traceback in that proposal, all
code that catches an exception into a variable creates a cycle, like
this:

  try:
      raise Exception
  except Exception, err:
      pass

This creates the following cycle:

             tb_frame
    frame <------------- traceback
      |                       ^
      |                       | __traceback__
      v                       |
   f_locals ---------------> err
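A sketch of that cycle, assuming an exception object carrying a PEP 344-style
__traceback__ attribute (written with the modern `except ... as` spelling;
the function name is illustrative):

```python
def catch_one():
    saved = None
    try:
        raise ValueError("boom")
    except ValueError as err:
        saved = err  # err.__traceback__ -> frame -> local 'saved' -> err
    return saved

e = catch_one()
# The traceback's frame is the one that raised *and* caught the exception,
# completing the cycle through that frame's locals.
assert e.__traceback__.tb_frame.f_code.co_name == "catch_one"
```

(For what it's worth, Python 3, which adopted PEP 344's __traceback__, unbinds
the `except ... as` variable at the end of the block for precisely this reason.)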

The good news in all this is that none of these objects has a __del__
method in the proposal; only the 'geniter' object would have one, and
getting it involved in a cycle seems rather unlikely.  I hereby
declare my worries unwarranted and will happily add language to the
revamped PEP 343 stating that a geniter object should have a tp_del
slot and a corresponding __del__ attribute.  This further complicates
has_finalizer() in gcmodule.c, to the point where the latter might
have to be turned into an internal slot.

> >Sure. But I still have some reservations, since cycles can pop up
> >in the strangest of places (especially when tracebacks are involved
> >-- tracebacks have been causing problems due to cycles and keeping
> >variables alive almost since Python's inception). I posted about
> >this a while ago, but don't recall seeing a response that took my
> >fear away.
> 
> Well, I can't prove that it's not possible to create such cycles,
> certainly.  But maybe we should finally get rid of the deprecated
> sys.exc_type/value/traceback variables, so that they can't root any
> cycles?

I sort of doubt that these are the main source of live cycles.  After
all, they are reset whenever a frame is popped (grep the sources for
reset_exc_info).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)