From ntoronto at cs.byu.edu  Thu Jan 10 02:25:10 2008
From: ntoronto at cs.byu.edu (Neil Toronto)
Date: Wed, 09 Jan 2008 18:25:10 -0700
Subject: [Python-ideas] Batty idea of the week: no implementation inheritance
Message-ID: <478573F6.7030909@cs.byu.edu>

It occurred to me that every problem I've had with object-oriented 
programming stems from the fact that subclasses implicitly inherit 
implementation details from their parent classes. It also occurred to me 
that I've never used implementation inheritance to solve a problem 
that couldn't have been solved approximately as easily via composition 
and delegation.

This kind of delegation would be similar to subclassing:

     class A:
         def foo(self):
             # do something cool

     class B:  # Not a subclass of A, but acts like one
         def __init__(self):
             self.a = A()

         def foo(self):
             # do something before
             ret = self.a.foo()
             # do something after
             return ret
         # or foo = A.foo for simple cases?

For lack of a better term (I'm probably just not well-versed enough in 
language theory) I'll call it reverse delegation. The big UI difference 
between this and subclassing is that here A retains control over its 
implementation and its writer doesn't have to worry about whether 
changing anything will break subclasses - because there aren't any.

There's also forward delegation for when a class doesn't want to care 
about how something is implemented:

     class A:
         def __init__(self, barimpl):
             self.barimpl = barimpl  # or gets it by some other means

         def bar(self):
             return self.barimpl.bar()

     class B:
         def bar(self):
             # do something cool

For some reason this gets called "dependency injection". (I know why, 
but IMO it's a horrible name.) It's typical OO practice to solve the 
same problem by defining A.bar as abstract, though it's increasingly 
frowned upon in favor of forward delegation / dependency injection.

It occurred to me that this sort of thing would work perfectly with duck 
typing, and even duck typing plus something like Java's interfaces. In 
the first example, as long as A and B have the same interface they could 
each pass the same type checks. In the second, A could define an 
interface containing bar (or more) and accept anything implementing it.

The big downside to both delegation types is that they're wordy, 
especially with interfaces added. With the right syntactic support (I 
admit I haven't given this much thought) it might be as nice as 
implementation inheritance. I also have a niggling feeling that I'm 
missing something, especially with regard to multiple inheritance, 
though I've never understood what the big deal was with that beyond 
inheriting an interface or convenient mixins.
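
For what it's worth, much of the wordiness in the first example can 
already be trimmed with __getattr__-based forwarding; a sketch under 
that assumption (the class and method names are illustrative):

```python
class A(object):
    def foo(self):
        return "A.foo"

    def bar(self):
        return "A.bar"

class B(object):
    """Acts like A without subclassing it: anything B does not
    override is forwarded to the wrapped A instance."""
    def __init__(self):
        self._a = A()

    def foo(self):
        # do something before, delegate the core work, do something after
        return "wrapped " + self._a.foo()

    def __getattr__(self, name):
        # reverse delegation: fall back to the delegate for
        # any attribute B does not define itself (e.g. bar)
        return getattr(self._a, name)
```

A keeps full control of its implementation; B depends only on A's 
public interface, never on its internals.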

Neil


From aaron.watters at gmail.com  Thu Jan 10 19:37:32 2008
From: aaron.watters at gmail.com (Aaron Watters)
Date: Thu, 10 Jan 2008 13:37:32 -0500
Subject: [Python-ideas] o(n**2) problem with marshal.dumps for large objects,
	with patch
Message-ID: <fc13a6500801101037m68fcbfaeyd10a5881788ad58c@mail.gmail.com>

Hi  folks.  Much to my surprise I found that one of
my applications seemed to be running slow as a result
of marshal.dumps.  I think the culprit is the w_more(...)
function, which grows the marshal buffer in 1k units.
This means that a marshal of size 100k will have 100
reallocations and string copies.  Other parts of Python
(and java etc) have a proportional reallocation strategy
which reallocates a new size based on the existing size.
This means a 100k marshal requires only a handful of
reallocations and string copies (O(n) versus O(n**2)
total copying work).
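
The difference is easy to demonstrate with a few lines of Python that 
merely count reallocations under the two growth rules (the counting 
helper is illustrative; the 1024-byte constants mirror w_more):

```python
def count_reallocs(target, grow):
    """Count buffer reallocations while growing a buffer from 0 to
    `target` bytes, where `grow` maps the old size to the new size."""
    size = reallocs = 0
    while size < target:
        size = grow(size)
        reallocs += 1
    return reallocs

fixed = count_reallocs(100 * 1024, lambda s: s + 1024)         # current w_more
doubling = count_reallocs(100 * 1024, lambda s: s + s + 1024)  # patched w_more
# fixed-increment growth reallocates 100 times; proportional
# growth reaches the same size in only 7 reallocations.
```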

I humbly submit the following patch (based on python 2.6a0
source).  I think it is a strict improvement on the existing
code, but I've been wrong before (twice ;)).
   -- Aaron Watters

PATCH FOLLOWS

*** marshal.c.original  2008-01-10 10:15:40.686838800 -0500
--- marshal.c   2008-01-10 11:32:01.838654000 -0500
***************
*** 61,75 ****
  static void
  w_more(int c, WFILE *p)
  {
        Py_ssize_t size, newsize;
        if (p->str == NULL)
                return; /* An error already occurred */
        size = PyString_Size(p->str);
!       newsize = size + 1024;
        if (_PyString_Resize(&p->str, newsize) != 0) {
                p->ptr = p->end = NULL;
        }
        else {
                p->ptr = PyString_AS_STRING((PyStringObject *)p->str) +
size;
                p->end =
                        PyString_AS_STRING((PyStringObject *)p->str) +
newsize;
--- 61,76 ----
  static void
  w_more(int c, WFILE *p)
  {
        Py_ssize_t size, newsize;
        if (p->str == NULL)
                return; /* An error already occurred */
        size = PyString_Size(p->str);
!       newsize = size + size + 1024;
!       /* printf("new size %d\n", newsize); */
        if (_PyString_Resize(&p->str, newsize) != 0) {
                p->ptr = p->end = NULL;
        }
        else {
                p->ptr = PyString_AS_STRING((PyStringObject *)p->str) +
size;
                p->end =
                        PyString_AS_STRING((PyStringObject *)p->str) +
newsize;

From lists at cheimes.de  Thu Jan 10 20:04:31 2008
From: lists at cheimes.de (Christian Heimes)
Date: Thu, 10 Jan 2008 20:04:31 +0100
Subject: [Python-ideas] o(n**2) problem with marshal.dumps for large
 objects, with patch
In-Reply-To: <fc13a6500801101037m68fcbfaeyd10a5881788ad58c@mail.gmail.com>
References: <fc13a6500801101037m68fcbfaeyd10a5881788ad58c@mail.gmail.com>
Message-ID: <47866C3F.90709@cheimes.de>

Aaron Watters wrote:
> Hi  folks.  Much to my surprise I found that one of
> my applications seemed to be running slow as a result
> of marshal.dumps.  I think the culprit is the w_more(...)
> function, which grows the marshal buffer in 1k units.
> This means that a marshal of size 100k will have 100
> reallocations and string copies.  Other parts of Python
> (and java etc) have a proportional reallocation strategy
> which reallocates a new size based on the existing size.
> This mean a 100k marshal requires just 5 or so
> reallocations and string copies (n log n versus n**2
> asymptotic performance).
> 
> I humbly submit the following patch (based on python 2.6a0
> source).  I think it is a strict improvement on the existing
> code, but I've been wrong before (twice ;)).

Could you please submit the patch at http://bugs.python.org/ ? Patches
in mailing list posts have a tendency to get lost. A unified diff (diff
-u) is preferred.

Thanks

Christian



From castironpi at comcast.net  Fri Jan 11 09:54:41 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Fri, 11 Jan 2008 02:54:41 -0600
Subject: [Python-ideas] Thread object
Message-ID: <20080111085444.92E881E400A@bag.python.org>

Hey Group:

I find myself using Threads quite a bit, and I so often write:
	bContinue= True
And in the thread:
	while bContinue:
And elsewhere:
	if ...: bContinue= False

Proposing for the standard library:

	th= threading.Thread( target= myfunction )

	def myfunction( selfth ):
		while selfth.bContinue:
			...

and

	th.bContinue= False.
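
For comparison, the same stop-flag idea can be sketched today as a 
small Thread subclass built on threading.Event (StoppableThread, 
Counter, and their method names are illustrative, not part of the 
proposal):

```python
import threading
import time

class StoppableThread(threading.Thread):
    """Thread with a built-in stop flag, as in the bContinue
    pattern above, but using an Event instead of a bare bool."""
    def __init__(self, *args, **kwargs):
        threading.Thread.__init__(self, *args, **kwargs)
        self._stopflag = threading.Event()

    def stop(self):
        self._stopflag.set()

    def going(self):
        return not self._stopflag.is_set()

class Counter(StoppableThread):
    def run(self):
        self.count = 0
        while self.going():    # the old "while bContinue:"
            self.count += 1
            time.sleep(0.01)
```

Here `t.stop()` replaces the module-level `bContinue = False`.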

Of course, there is a little more functionality that's very common as well.
Throw it in, touched up from above:

	def pausesleep( self, secs ):
		'''Pauses until 'stopped' is set, 'unpaused' is set
		(currently neglects to account for remaining unelapsed
		seconds), or 'secs' seconds elapse.  Set 'stopped' and
		'unpaused' with Stop(), Pause(), and Resume().  Also,
		Going() returns not stopped.isSet().'''

		self.stopped.wait( secs )
		self.unpaused.wait() #secs- timeout or sth.

Additional; further proposing:
	def Copy( self, runimm= None, name= None ),
	def StopAll(),
	def JoinAll(),
	def PauseAll(),
	def ResumeAll().

However, to quote the docs: 'group should be None; reserved for future
extension when a ThreadGroup class is implemented', so this may already
be under way.

Current implementation (read: working version) available and offered to
post.

Thank you and sincerely,
Aaron



From facundobatista at gmail.com  Fri Jan 11 12:51:28 2008
From: facundobatista at gmail.com (Facundo Batista)
Date: Fri, 11 Jan 2008 09:51:28 -0200
Subject: [Python-ideas] o(n**2) problem with marshal.dumps for large
	objects, with patch
In-Reply-To: <47866C3F.90709@cheimes.de>
References: <fc13a6500801101037m68fcbfaeyd10a5881788ad58c@mail.gmail.com>
	<47866C3F.90709@cheimes.de>
Message-ID: <e04bdf310801110351s35d36449pf80b3d0f60e2d230@mail.gmail.com>

2008/1/10, Christian Heimes <lists at cheimes.de>:

> Could you please submit the patch at http://bugs.python.org/ ? Patches
> in mailing list posts have a tendency to get lost. An unified diff (diff
> -u) is preferred.

Please also include a small example that shows the improvement, so
it can actually be measured (a few lines of Python code that take a
long time before the patch and very little time after it).

Thank you!!

-- 
.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/


From castironpi at comcast.net  Sat Jan 12 21:03:44 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Sat, 12 Jan 2008 14:03:44 -0600
Subject: [Python-ideas] generic Ref class
Message-ID: <20080112201059.843CE1E400F@bag.python.org>

For iteration and pass-by-reference semantics:

	class Ref:
		def __init__( self, val):
			self.val= val

To modify list during iteration:

	def refs( iterable ):
		return [ Ref( x ) for x in iterable ]

Then:

	for x in refs( listA ):
		x.val+= 1

Better than:

	for i, x in enumerate( listA ):
		listA[i]= x+ 1

Example is naturally trivial; imagine.  No bloat here; get over it.



From adam at atlas.st  Sat Jan 12 21:16:02 2008
From: adam at atlas.st (Adam Atlas)
Date: Sat, 12 Jan 2008 15:16:02 -0500
Subject: [Python-ideas] generic Ref class
In-Reply-To: <20080112201059.843CE1E400F@bag.python.org>
References: <20080112201059.843CE1E400F@bag.python.org>
Message-ID: <8C3E6EE5-6900-4035-B228-C840DB4E0620@atlas.st>


On 12 Jan 2008, at 15:03, Aaron Brady wrote:
> Then:
>
> 	for x in refs( listA ):
> 		x.val+= 1
>
> Better than:
>
> 	for i, x in enumerate( listA ):
> 		listA[i]= x+ 1

Don't these do different things? The latter modifies the original  
list, while the former, with your Ref class, apparently modifies (in a  
by-reference sense) a new list that is thrown away once the for loop  
is done.


From castironpi at comcast.net  Sat Jan 12 21:21:47 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Sat, 12 Jan 2008 14:21:47 -0600
Subject: [Python-ideas] generic Ref class
In-Reply-To: <8C3E6EE5-6900-4035-B228-C840DB4E0620@atlas.st>
Message-ID: <20080112202159.5C0991E4012@bag.python.org>

> -----Original Message-----
> From: python-ideas-bounces+castironpi=comcast.net at python.org
> [mailto:python-ideas-bounces+castironpi=comcast.net at python.org] On Behalf
> Of Adam Atlas
> 
> On 12 Jan 2008, at 15:03, Aaron Brady wrote:
> > Then:
> >
> > 	for x in refs( listA ):
> > 		x.val+= 1
> >
> > Better than:
> >
> > 	for i, x in enumerate( listA ):
> > 		listA[i]= x+ 1
> 
> Don't these do different things? The latter modifies the original
> list, while the former, with your Ref class, apparently modifies (in a
> by-reference sense) a new list that is thrown away once the for loop
> is done.

Ah yes.  Say:

	listA= refs( range( 20 ) ) #or your list

then:

	for x in listA:
		x.val+= 1

Slightly slower, but also useful for passing to functions:

	def squareanint( intref ):
		intref.val**= 2

	a= 2
	squareanint( a )
	print a




From castironpi at comcast.net  Sat Jan 12 21:25:55 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Sat, 12 Jan 2008 14:25:55 -0600
Subject: [Python-ideas] generic Ref class
In-Reply-To: <20080112202159.5C0991E4012@bag.python.org>
Message-ID: <20080112202608.617171E400F@bag.python.org>

> -----Original Message-----
> From: python-ideas-bounces at python.org [mailto:python-ideas-
> bounces at python.org] On Behalf Of Aaron Brady
> 
> > -----Original Message-----
> > From: python-ideas-bounces+castironpi=comcast.net at python.org
> > [mailto:python-ideas-bounces+castironpi=comcast.net at python.org] On
> Behalf
> > Of Adam Atlas
> >
> > On 12 Jan 2008, at 15:03, Aaron Brady wrote:
> > > Then:
> > >
> > > 	for x in refs( listA ):
> > > 		x.val+= 1
> > >
> > > Better than:
> > >
> > > 	for i, x in enumerate( listA ):
> > > 		listA[i]= x+ 1
> >
> > Don't these do different things? The latter modifies the original
> > list, while the former, with your Ref class, apparently modifies (in a
> > by-reference sense) a new list that is thrown away once the for loop
> > is done.
> 
> Ah yes.  Say:
> 
> 	listA= refs( range( 20 ) ) #or your list
> 
> then:
> 
> 	for x in listA:
> 		x.val+= 1
> 
> Slightly slower, but useful in addition to pass to functions too:
> 
> 	def squareanint( intref ):
> 		intref.val**= 2
> 
> 	a= 2
> 	squareanint( a )
> 	print a

And once again, I typo:

 	a= 2
 	squareanint( Ref( a ) )
 	print a




From castironpi at comcast.net  Sat Jan 12 21:28:37 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Sat, 12 Jan 2008 14:28:37 -0600
Subject: [Python-ideas] generic Ref class
Message-ID: <20080112202849.7B9041E400F@bag.python.org>

> -----Original Message-----
> From: Aaron Brady [mailto:castironpi at comcast.net]
> Sent: Saturday, January 12, 2008 2:26 PM
> To: 'python-ideas at python.org'
> Subject: RE: [Python-ideas] generic Ref class
> 
> > -----Original Message-----
> > From: python-ideas-bounces at python.org [mailto:python-ideas-
> > bounces at python.org] On Behalf Of Aaron Brady
> >
> > > -----Original Message-----
> > > From: python-ideas-bounces+castironpi=comcast.net at python.org
> > > [mailto:python-ideas-bounces+castironpi=comcast.net at python.org] On
> > Behalf
> > > Of Adam Atlas
> > >
> > > On 12 Jan 2008, at 15:03, Aaron Brady wrote:
> > > > Then:
> > > >
> > > > 	for x in refs( listA ):
> > > > 		x.val+= 1
> > > >
> > > > Better than:
> > > >
> > > > 	for i, x in enumerate( listA ):
> > > > 		listA[i]= x+ 1
> > >
> > > Don't these do different things? The latter modifies the original
> > > list, while the former, with your Ref class, apparently modifies (in a
> > > by-reference sense) a new list that is thrown away once the for loop
> > > is done.
> >
> > Ah yes.  Say:
> >
> > 	listA= refs( range( 20 ) ) #or your list
> >
> > then:
> >
> > 	for x in listA:
> > 		x.val+= 1
> >
> > Slightly slower, but useful in addition to pass to functions too:
> >
> > 	def squareanint( intref ):
> > 		intref.val**= 2
> >
> > 	a= 2
> > 	squareanint( a )
> > 	print a
> 
> And once again, I typo:
> 
>  	a= 2
>  	squareanint( Ref( a ) )
>  	print a

Sadly, no good.

	def squareanint( intref ):
		intref.val**= 2
	a= Ref( 3 )
	squareanint( a )
	assert a.val== 9

Like I said, bulky but very handy sometimes.  It's in -my- library...
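
For comparison, a common cookbook-style alternative gets the same 
effect with a one-element list as the mutable cell, with no class 
at all:

```python
def squareanint(cell):
    cell[0] **= 2   # mutate through the cell, as with Ref.val

a = [3]
squareanint(a)
assert a[0] == 9
```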



From aahz at pythoncraft.com  Sat Jan 12 21:51:09 2008
From: aahz at pythoncraft.com (Aahz)
Date: Sat, 12 Jan 2008 12:51:09 -0800
Subject: [Python-ideas] generic Ref class
In-Reply-To: <20080112201059.843CE1E400F@bag.python.org>
References: <20080112201059.843CE1E400F@bag.python.org>
Message-ID: <20080112205109.GA2424@panix.com>

On Sat, Jan 12, 2008, Aaron Brady wrote:
>
> For iteration and pass-by-reference semantics:
> 
> 	class Ref:
> 		def __init__( self, val):
> 			self.val= val

Good addition to the cookbook if it's not already there.  IMO, not
appropriate for the standard library.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

Weinberg's Second Law: If builders built buildings the way programmers wrote 
programs, then the first woodpecker that came along would destroy civilization.


From castironpi at comcast.net  Sat Jan 12 23:43:24 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Sat, 12 Jan 2008 16:43:24 -0600
Subject: [Python-ideas] generic Ref class
In-Reply-To: <20080112205109.GA2424@panix.com>
Message-ID: <20080112224338.9C2401E400F@bag.python.org>

> -----Original Message-----
> From: python-ideas-bounces at python.org [mailto:python-ideas-
> bounces at python.org] On Behalf Of Aahz
> 
> On Sat, Jan 12, 2008, Aaron Brady wrote:
> >
> > For iteration and pass-by-reference semantics:
> >
> > 	class Ref:
> > 		def __init__( self, val):
> > 			self.val= val
> 
> Good addition to the cookbook if it's not already there.  IMO, not
> appropriate for the standard library.

That would be your prerogative.  But ah, cookbook: I like the sound of it.



From castironpi at comcast.net  Sun Jan 13 02:20:49 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Sat, 12 Jan 2008 19:20:49 -0600
Subject: [Python-ideas] sharedmutable locking solution
Message-ID: <20080113012100.CF4771E401B@bag.python.org>

Practical concurrency solution attached and reproduced below.  Includes
a worked example.  Brainstorm attached for reference.

Excerpt:

1.  Put a single thread in charge
2.  Lock mutation and/or all operations
3.  @mutation decorator
4.  List-only solution
5.  @onfulfill decorator
6.  Ideal solution (implemented)
      Timing:     Important   Unimportant
      Return value:
      Ignore      (i.)        (iii.)
      Use         (i.)        (ii.)

sharedmutable.py
---
from __future__ import with_statement
import threading
import thread
from Queue import Queue
class Ref:
    '''To adjust a value a reference to which is
    not available.'''
    def __init__( self, val= None ): self.val= val
    def __repr__( self ): return repr( self.val )

class LockerItem( object ):
    def __call__( self, farevent ):
        raise NotImplementedError

class LockerItemBlock( LockerItem ):
    '''Turn-taking Locker item.  __call__ clears the
    passed-in event (which must have 'clear' and 'wait'
    methods) and sets its own; wait() blocks until
    __call__ has set it.'''
    def __init__( self ):
        self.event= threading.Event()
    def __call__( self, farevent ):
        farevent.clear()
        self.event.set()
    def wait( self, timeout= None ):
        self.event.wait( timeout )

class LockerItemFaF( LockerItem ):
    '''Fire-and-Forget Locker item; calling it just calls
    func.  Use as a drop-in counterpart to LockerItemBlock.'''
    def __init__( self, func, *ar, **kwar ):
        self.func, self.ar, self.kwar= func, ar, kwar
        self.ret= None
    def __call__( self, _ ):
        self.ret= self.func( *self.ar, **self.kwar )

class Locker( object ):
    '''Synchronization class.  Operations can honor:
        i.  Fire, block for completion (timing
            important: do here.)
        ii. Fire, launch new thread with result upon
            completion (timing unimportant, return
            important: do something with result later,
            in turn.)
        iii.    Fire, forget, continue (timing & return
            ignored: just do.)

        Timing:     Important   Unimportant
        Return value:
        Ignore      (i.)        (iii.)
        Use         (i.)        (ii.)
    Corresponding instance functions are:
        i.  Locker.op( operation & args )
            -or-
            Locker.acq()
            ...code block...
            Locker.rel()
            -or-
            with Locker:
                ...code block...
        ii. Decorator or otherwise:
            Locker.onfulfill( operation & args )
            ( oncompletioncallable )
        iii.    Fire-and-forget (faf) + options:
            Locker.faf( operation & args )
    To accomplish this, a Locker instance has a
    Queue of callables, and -produces- LockerItemFaF
    (fire-and-forget) and LockerItemBlock instances,
    which are enqueued and granted in order received.
    acq() and op() "get the caller in line" for
    ownership; faf() gets something else in line: in
    particular, a LockerItemFaF instance.

    A LockerItem subclass instance is called when
    "it's its turn" in Queue, with 'noget' as the
    parameter; '_mainth', started in initialization,
    blocks on 'noget' after each 'get' operation.
    instance.rel() clears 'noget', and
    LockerItemFaF.__call__ does not set it.
    (LockerItemBlock.__call__ does; acq() enqueues
    LockerItemBlock instances.)

    Usage:

    "with 'instance': (suite)" is equivalent to:
        instance.acq()
        (suite)
        instance.rel()
    (suite) may not call acq().
    'op' can be used for simple-statement suites.
        instance.op( listA.append, None )
        instance.op( listA.reverse )
    
    Decorator 'instance.onfulfill' spawns a new
    thread, whence, when it's its turn, it calls
    its decoratee with the result of the operation
    as the only parameter.
        @instance.onfulfill( dequeA.popleft )
        def onfulfill1( result ):
            #do something with the popped value
            print '%s was popped\n'% result,

    Use @ThreadUtils.ThFork for something complex.
        @ThreadUtils.ThFork
        def fork1():
            with instance:
                #something complex
                print listA
    '''
    def __init__( self ):
        self.queue= Queue()
        self.noget= threading.Event()
        self.thread= th= threading.Thread( target= self._mainth )
        th.setDaemon( True )
        th.start()
    def _mainth( self ):
        self.noget.clear()
        while 1:
            item= self.queue.get()
            item( self.noget )
            self.noget.wait()
    def acq( self ):
        item= LockerItemBlock()
        self.queue.put( item )
        item.wait()
        return item
    def rel( self ):
        self.noget.set()
    def op( self, func, *ar, **kwar ):
        self.acq()
        ret= func( *ar, **kwar )
        self.rel()
        return ret
    def __enter__( self ):
        return self.acq()
    def __exit__( self, *excs ):
        self.rel()
    def onfulfill( self, func, *ar, **kwar ):
        '''decorator launches the decoratee in separate
        thread upon completion of func.'''
        locfunc= Ref()
        def callee():
            result= self.op( func, *ar, **kwar )
            locfunc.val( result )
        def prefulfill( func ):
            locfunc.val= func
            th= threading.Thread( target= callee )
            th.start()
        return prefulfill
    def faf( self, func, *ar, **kwar ):
        '''fire and forget'''
        return self.fafroot( False, None, func, *ar, **kwar )
    def fafoption( self, func, *ar, **kwar ):
        return self.fafroot( True, None, func, *ar, **kwar )
    def faftimeout( self, timeout, func, *ar, **kwar ):
        return self.fafroot( False, timeout, func, *ar, **kwar )
    def fafroot( self, block, timeout, func, *ar, **kwar ):
        item= LockerItemFaF( func, *ar, **kwar )
        self.queue.put( item, block, timeout )
        return item

if __name__== '__main__':
    '''Thing to notice: fulfill1 prints a >0 value
    when thTry producers are adding integers in the
    closed interval [300,600].  By design there is a
    small chance of assertion failure, unobserved yet.'''
    import time
    import random
    counter= 0
    def simplecounter():
        global counter
        ret= counter
        counter+= 1
        time.sleep( random.uniform( 0, 0.01 ) )
        return counter
    listA= []
    lockerA= Locker()
    def thTry():
        while 1:
            with lockerA:
                ret= simplecounter()
                listA.append( ret )
                print ret,
                '''this assertion fails outside of locker with.'''
                assert all( [ listA[i]< listA[i+1] 
                    for i in range( len( listA )- 1 ) ] )
            if random.random()< 0.8 or len( listA )> 10:
                '''fire-and-forget example.  80% chance of
                removing an element (hence may fail), and 100%
                if listA has 'a lot' of elements.'''
                lockerA.faf( listA.pop, 0 )
            '''return is important on this one; must block for.'''
            ret= list( lockerA.op( reversed, listA ) )
            if len( ret )> 1:
                assert all( [ ret[i]> ret[i+1] 
                    for i in range( len( ret )- 1 ) ] )
            if random.random()< .05:
                '''return is important, but timing is not.'''
                @lockerA.onfulfill( set, listA )
                def fulfill1( result ):
                    count= 0
                    for si in result:
                        if 300<= si<= 600:
                            count+= 1
                    print "\n\t%i counts in [300,600]\n"% count,
    def thInterfere():
        while 1:
            with lockerA:
                '''remove a multiple of 2 from somewhere.'''
                ret= None
                for i in range( len( listA ) ):
                    if listA[ i ]% 2== 0:
                        ret= listA.pop( i )
                        break
            if ret is not None:
                assert ret% 2== 0
                print '\n\tremoved %i\n'% ret,
            time.sleep( 0.5 )
    def thMon():
        while 1:
            print '\n\t%s\n'% listA,
            time.sleep( 0.5 )
    thread.start_new_thread( thMon, () )
    thread.start_new_thread( thInterfere, () )
    for _ in range( 10 ):
        '''start a bunch of producer threads.'''
        thread.start_new_thread( thTry, () )
        time.sleep( 0.1 )
    '''and wait.'''
    time.sleep( 1000 )
-------------- next part --------------
A non-text attachment was scrubbed...
Name: sharedmutable.zip
Type: application/octet-stream
Size: 3833 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20080112/60509819/attachment.obj>

From castironpi at comcast.net  Mon Jan 14 05:26:53 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Sun, 13 Jan 2008 22:26:53 -0600
Subject: [Python-ideas] other thread exception
Message-ID: <20080114042706.60FCF1E4028@bag.python.org>

Raise an exception in another thread.
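
For reference, CPython already exposes a C-level hook for this,
PyThreadState_SetAsyncExc, reachable from Python via ctypes. A hedged
sketch (the helper name is mine; note the exception is only delivered
at a bytecode boundary, so a thread blocked inside a C call is not
interrupted):

```python
import ctypes
import threading
import time

def async_raise(tid, exc_type):
    """CPython-specific: ask the interpreter to raise exc_type in
    the thread with identifier tid at its next bytecode boundary."""
    n = ctypes.pythonapi.PyThreadState_SetAsyncExc(
        ctypes.c_long(tid), ctypes.py_object(exc_type))
    if n == 0:
        raise ValueError("invalid thread id")

caught = []
def worker():
    try:
        while True:
            time.sleep(0.01)   # exception lands after a sleep returns
    except KeyboardInterrupt:
        caught.append(True)

t = threading.Thread(target=worker)
t.start()
time.sleep(0.1)
async_raise(t.ident, KeyboardInterrupt)
t.join(5)
```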



From guido at python.org  Thu Jan 17 01:31:52 2008
From: guido at python.org (Guido van Rossum)
Date: Wed, 16 Jan 2008 16:31:52 -0800
Subject: [Python-ideas] Default decorator?
Message-ID: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>

Peter Norvig suggested an idea to me over lunch: a "default
decorator". This would be something you could set once per module and
it would be invoked for each function definition in the same way as a
decorator is invoked, before any explicit decorators. His use case was
something that wraps every function that uses argument annotations
with something that interprets those annotations in a certain way and
enforces that interpretation. It would save having to explicitly
annotate every function or method that way.

Thoughts? I suggested that a metaclass could do this for methods, but
that of course leaves plain functions in the lurch.
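
The metaclass half of that suggestion can be sketched as follows
(written with the Python 3 metaclass keyword; log_calls and the class
names are illustrative):

```python
import types

def log_calls(func):
    """Illustrative 'default decorator': count calls to func."""
    def wrapper(*args, **kwargs):
        wrapper.calls += 1
        return func(*args, **kwargs)
    wrapper.calls = 0
    return wrapper

class DecorateMethods(type):
    """Apply the default decorator to every plain function in the
    class body, before the class object is created."""
    def __new__(mcls, name, bases, namespace):
        for attr, value in list(namespace.items()):
            if isinstance(value, types.FunctionType):
                namespace[attr] = log_calls(value)
        return type.__new__(mcls, name, bases, namespace)

class C(metaclass=DecorateMethods):
    def ping(self):
        return "pong"
```

As noted, module-level functions are untouched by this; that is
exactly the gap a default decorator would fill.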

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From castironpi at comcast.net  Thu Jan 17 01:37:41 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Wed, 16 Jan 2008 18:37:41 -0600
Subject: [Python-ideas] Default decorator?
In-Reply-To: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
Message-ID: <20080117004458.AE8FA1E400A@bag.python.org>

> -----Original Message-----
> From: python-ideas-bounces at python.org [mailto:python-ideas-
> bounces at python.org] On Behalf Of Guido van Rossum
> Sent: Wednesday, January 16, 2008 6:32 PM
> 
> Peter Norvig suggested an idea to me over lunch: a "default
> decorator". This would be something you could set once per module and
> it would be invoked for each function definition in the same way as a
> decorator is invoked, before any explicit decorators. His use case was
> something that wraps every function that uses argument annotations
> with something that interprets those annotations in a certain way and
> enforces that interpretation. It would save having to explicitly
> annotate every function or method that way.

__decor__= functools.partial( "when's that?" )

@undecorated
def funcA():
	print 'no args.'





From brett at python.org  Thu Jan 17 01:58:40 2008
From: brett at python.org (Brett Cannon)
Date: Wed, 16 Jan 2008 16:58:40 -0800
Subject: [Python-ideas] Default decorator?
In-Reply-To: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
Message-ID: <bbaeab100801161658qfce5c59v745fe10034016c3c@mail.gmail.com>

On Jan 16, 2008 4:31 PM, Guido van Rossum <guido at python.org> wrote:
> Peter Norvig suggested an idea to me over lunch: a "default
> decorator". This would be something you could set once per module and
> it would be invoked for each function definition in the same way as a
> decorator is invoked, before any explicit decorators. His use case was
> something that wraps every function that uses argument annotations
> with something that interprets those annotations in a certain way and
> enforces that interpretation. It would save having to explicitly
> annotate every function or method that way.
>
> Thoughts? I suggested that a metaclass could do this for methods, but
> that of course leaves plain functions in the lurch.

Do we want something like a global decorator, or more something like
__build_class__ for the 'def' statement (although a quick attempt at
replacing __build_class__ didn't work for me)? I personally prefer the
latter as that is the more powerful solution between the two. Plus I
don't see the need for a global decorator coming up often enough to
warrant making it simpler.

-Brett


From lists at cheimes.de  Thu Jan 17 02:05:40 2008
From: lists at cheimes.de (Christian Heimes)
Date: Thu, 17 Jan 2008 02:05:40 +0100
Subject: [Python-ideas] Default decorator?
In-Reply-To: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
Message-ID: <478EA9E4.1090005@cheimes.de>

Guido van Rossum wrote:
> Peter Norvig suggested an idea to me over lunch: a "default
> decorator". This would be something you could set once per module and
> it would be invoked for each function definition in the same way as a
> decorator is invoked, before any explicit decorators. His use case was
> something that wraps every function that uses argument annotations
> with something that interprets those annotations in a certain way and
> enforces that interpretation. It would save having to explicitly
> annotate every function or method that way.
> 
> Thoughts? I suggested that a metaclass could do this for methods, but
> that of course leaves plain functions in the lurch.

Is it really required? As it is trivial to write a metaclass that
decorates all members of a class, it is also trivial to write a function
which accepts a module name and decorates all functions in the module.

import sys

def decorate_functions(decorator, modname):
    function = type(decorate_functions)  # the plain-function type
    namespace = sys.modules[modname].__dict__
    for name, obj in list(namespace.items()):
        if isinstance(obj, function):
            namespace[name] = decorator(obj)

decorate_functions(somefunc, __name__)


Does a default decorator also decorate nested functions (functions within
functions)? And does it also decorate class members or functions created
with exec()? What about lambda?

Christian



From guido at python.org  Thu Jan 17 02:06:08 2008
From: guido at python.org (Guido van Rossum)
Date: Wed, 16 Jan 2008 17:06:08 -0800
Subject: [Python-ideas] Default decorator?
In-Reply-To: <bbaeab100801161658qfce5c59v745fe10034016c3c@mail.gmail.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
	<bbaeab100801161658qfce5c59v745fe10034016c3c@mail.gmail.com>
Message-ID: <ca471dc20801161706p2e005267q29ca9c8a33d3b346@mail.gmail.com>

On Jan 16, 2008 4:58 PM, Brett Cannon <brett at python.org> wrote:
>
> On Jan 16, 2008 4:31 PM, Guido van Rossum <guido at python.org> wrote:
> > Peter Norvig suggested an idea to me over lunch: a "default
> > decorator". This would be something you could set once per module and
> > it would be invoked for each function definition in the same way as a
> > decorator is invoked, before any explicit decorators. His use case was
> > something that wraps every function that uses argument annotations
> > with something that interprets those annotations in a certain way and
> > enforces that interpretation. It would save having to explicitly
> > annotate every function or method that way.
> >
> > Thoughts? I suggested that a metaclass could do this for methods, but
> > that of course leaves plain functions in the lurch.
>
> Do we want something like a global decorator, or more something like
> __build_class__ for the 'def' statement (although a quick attempt at
> replacing __build_class__ didn't work for me)?

Wasn't __build_class__ only in an early version of PEP 3115?

> I personally prefer the
> latter as that is the more powerful solution between the two. Plus I
> don't see the need for a global decorator coming up often enough to
> warrant making it simpler.

The use case might come up a lot more if people take the argument
annotations idea and run with it. They might want to add lots of
annotations to lots of functions, and then have one convenient way to
ensure all those functions are wrapped by something that actually uses
the annotations for something (e.g. adaptation or type checking).
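
A minimal sketch of such an enforcing wrapper, written with today's
inspect.signature (which did not exist at the time of this thread) and
assuming annotations are plain types:

```python
import functools
import inspect

def enforce(func):
    """Check each annotated argument against its annotation at call time."""
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            ann = sig.parameters[name].annotation
            if ann is not inspect.Parameter.empty and not isinstance(value, ann):
                raise TypeError("%s must be %s, not %s"
                                % (name, ann.__name__, type(value).__name__))
        return func(*args, **kwargs)
    return wrapper

@enforce
def greet(name: str, times: int):
    return name * times

greet("hi", 3)      # fine
# greet("hi", "3")  # would raise TypeError
```

A default decorator would apply something like ``enforce`` to every
function automatically, passing unannotated functions through untouched.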

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From castironpi at comcast.net  Thu Jan 17 02:13:52 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Wed, 16 Jan 2008 19:13:52 -0600
Subject: [Python-ideas] Default decorator?
In-Reply-To: <478EA9E4.1090005@cheimes.de>
Message-ID: <20080117011402.BB7E91E400A@bag.python.org>

> -----Original Message-----
> From: python-ideas-bounces at python.org [mailto:python-ideas-
> bounces at python.org] On Behalf Of Christian Heimes
> Does a default decorator also decorate nested functions (function with
> functions)? And does it also decorate class members or function created
> with exec()? What about lambda?

__decor__= functools.partial( "when?" )
__lambdadec__= functools.partial( "say when." )
__classdec__= functools.partial( "now." )
__classmethoddec__= functools.partial( "ooh.  too much." )

or:

from functools import partial
__decor__= funcutils.ModuleDec( partial( "where?" ), lambdadec= partial(
"there." ), classmethdec= partial( "never." ) )




From lists at cheimes.de  Thu Jan 17 02:14:18 2008
From: lists at cheimes.de (Christian Heimes)
Date: Thu, 17 Jan 2008 02:14:18 +0100
Subject: [Python-ideas] Default decorator?
In-Reply-To: <ca471dc20801161706p2e005267q29ca9c8a33d3b346@mail.gmail.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>	<bbaeab100801161658qfce5c59v745fe10034016c3c@mail.gmail.com>
	<ca471dc20801161706p2e005267q29ca9c8a33d3b346@mail.gmail.com>
Message-ID: <478EABEA.1050502@cheimes.de>

Guido van Rossum wrote:
> Wasn't __build_class__ only in an early version of PEP 3115?

It's still in builtins as builtins.__build_class__ and it's used by
ceval for LOAD_BUILD_CLASS.
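
In current CPython this can even be hooked from pure Python, bearing in
mind that replacing __build_class__ is an implementation detail rather
than a guaranteed API:

```python
import builtins

_original = builtins.__build_class__
built = []

def noisy_build_class(func, name, *bases, **kwds):
    # Record which classes get built, then defer to the original
    built.append(name)
    return _original(func, name, *bases, **kwds)

builtins.__build_class__ = noisy_build_class
try:
    class Demo:
        pass
finally:
    builtins.__build_class__ = _original
```

After this runs, ``built == ["Demo"]``: every ``class`` statement goes
through the hook.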

> The use case might come up a lot more if people take the argument
> annotations idea and run with it. They might want to add lots of
> annotations to lots of functions, and then have one convenient way to
> ensure all those functions are wrapped by something that actually uses
> the annotations for something (e.g. adaptation or type checking).

Right, the use cases will pop up once people start to use annotations a
lot. But does the use case really justify a new hook? Isn't a metaclass
factory and a simple function enough?

Christian



From guido at python.org  Thu Jan 17 02:18:34 2008
From: guido at python.org (Guido van Rossum)
Date: Wed, 16 Jan 2008 17:18:34 -0800
Subject: [Python-ideas] Default decorator?
In-Reply-To: <20080117011402.BB7E91E400A@bag.python.org>
References: <478EA9E4.1090005@cheimes.de>
	<20080117011402.BB7E91E400A@bag.python.org>
Message-ID: <ca471dc20801161718q42540374ne42640e2af3eb800@mail.gmail.com>

On Jan 16, 2008 5:13 PM, Aaron Brady <castironpi at comcast.net> wrote:
> __decor__= functools.partial( "when?" )
> __lambdadec__= functools.partial( "say when." )
> __classdec__= functools.partial( "now." )
> __classmethoddec__= functools.partial( "ooh.  too much." )
>
> or:
>
> from functools import partial
> __decor__= funcutils.ModuleDec( partial( "where?" ), lambdadec= partial(
> "there." ), classmethdec= partial( "never." ) )

Aaron, if you aren't capable of expressing your thoughts in English,
you might as well not bother to post.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From tony at PageDNA.com  Thu Jan 17 02:16:37 2008
From: tony at PageDNA.com (Tony Lownds)
Date: Wed, 16 Jan 2008 17:16:37 -0800
Subject: [Python-ideas] Default decorator?
In-Reply-To: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
Message-ID: <4A420910-B0A8-4457-B322-3D9FCD303D36@PageDNA.com>

On Jan 16, 2008, at 4:31 PM, Guido van Rossum wrote:

> Peter Norvig suggested an idea to me over lunch: a "default
> decorator". This would be something you could set once per module and
> it would be invoked for each function definition in the same way as a
> decorator is invoked, before any explicit decorators. His use case was
> something that wraps every function that uses argument annotations
> with something that interprets those annotations in a certain way and
> enforces that interpretation. It would save having to explicitly
> annotate every function or method that way.
>
> Thoughts? I suggested that a metaclass could do this for methods, but
> that of course leaves plain functions in the lurch.

Great idea. +1 on a "default decorator".

Like the global __metaclass__, I'd hope that it could apply to functions
nested in classes, but not functions nested in functions.

A lot of the same reasons for __metaclass__ apply to __decorator__:

"The potential uses for metaclasses are boundless. Some ideas that have
been explored including logging, interface checking, automatic  
delegation,
automatic property creation, proxies, frameworks, and automatic resource
locking/synchronization."

http://docs.python.org/ref/metaclasses.html

-Tony


From ryan.freckleton at gmail.com  Thu Jan 17 02:33:04 2008
From: ryan.freckleton at gmail.com (Ryan Freckleton)
Date: Wed, 16 Jan 2008 18:33:04 -0700
Subject: [Python-ideas] Default decorator?
In-Reply-To: <478EA9E4.1090005@cheimes.de>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
	<478EA9E4.1090005@cheimes.de>
Message-ID: <318072440801161733q9e74796v83a5d3414c13539f@mail.gmail.com>

+1 from me. But I'm just an ignorant Python end-user :-) I imagine it
would probably end up being used like the __metaclass__ module hook is
now.

On Jan 16, 2008 6:05 PM, Christian Heimes <lists at cheimes.de> wrote:
> Does a default decorator also decorate nested functions (function with
> functions)? And does it also decorate class members or function created
> with exec()? What about lambda?

The __metaclass__ hook currently changes the metaclass for nested
classes, the decorator hook could follow the same semantics:

>>> class Foo(type):
...     pass
...
>>> __metaclass__ = Foo
>>> class C:
...     class D:
...             pass
...
>>> type(C)
<class '__main__.Foo'>
>>> type(C.D)
<class '__main__.Foo'>

Lambdas don't currently support parameter annotations, so they should
be left out.

-- 
=====
--Ryan E. Freckleton


From lucio.torre at gmail.com  Thu Jan 17 03:15:22 2008
From: lucio.torre at gmail.com (Lucio Torre)
Date: Wed, 16 Jan 2008 23:15:22 -0300
Subject: [Python-ideas] Default decorator?
In-Reply-To: <318072440801161733q9e74796v83a5d3414c13539f@mail.gmail.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
	<478EA9E4.1090005@cheimes.de>
	<318072440801161733q9e74796v83a5d3414c13539f@mail.gmail.com>
Message-ID: <999187ed0801161815n64d20c68oa2778481b15a1dc@mail.gmail.com>

On Jan 16, 2008 10:33 PM, Ryan Freckleton <ryan.freckleton at gmail.com> wrote:

> +1 from me. But I'm just a ignorant python end-user :-) I imagine it
> would probably end up being used like the __metaclass__ module hook is
> now.
>

One problem is that you can override a module-level __metaclass__ with a
class-level __metaclass__, but I don't see how I could express this for
default decorators. How would I be able to say "all functions but this one"?
In that case, the default decorator is useless.

Lucio.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20080116/e6929ca5/attachment.html>

From phd at phd.pp.ru  Thu Jan 17 09:35:44 2008
From: phd at phd.pp.ru (Oleg Broytmann)
Date: Thu, 17 Jan 2008 11:35:44 +0300
Subject: [Python-ideas] Default decorator?
In-Reply-To: <478EA9E4.1090005@cheimes.de>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
	<478EA9E4.1090005@cheimes.de>
Message-ID: <20080117083544.GB1684@phd.pp.ru>

On Thu, Jan 17, 2008 at 02:05:40AM +0100, Christian Heimes wrote:
> it is also trivial to write a function
> which accepts a module name

   A module object, perhaps? Modules are first-class citizens, no need to
fetch them by name. Or have I missed something?

Oleg.
-- 
     Oleg Broytmann            http://phd.pp.ru/            phd at phd.pp.ru
           Programmers don't die, they just GOSUB without RETURN.


From arno at marooned.org.uk  Thu Jan 17 10:46:16 2008
From: arno at marooned.org.uk (Arnaud Delobelle)
Date: Thu, 17 Jan 2008 09:46:16 +0000
Subject: [Python-ideas] Default decorator?
In-Reply-To: <478EA9E4.1090005@cheimes.de>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
	<478EA9E4.1090005@cheimes.de>
Message-ID: <C149F97F-28E3-4AEC-938E-F3E2673707E8@marooned.org.uk>

On 17 Jan 2008, at 01:05, Christian Heimes wrote:

> Guido van Rossum wrote:
>> Peter Norvig suggested an idea to me over lunch: a "default
>> decorator". This would be something you could set once per module and
>> it would be invoked for each function definition in the same way as a
>> decorator is invoked, before any explicit decorators. His use case  
>> was
>> something that wraps every function that uses argument annotations
>> with something that interprets those annotations in a certain way and
>> enforces that interpretation. It would save having to explicitly
>> annotate every function or method that way.
>>
>> Thoughts? I suggested that a metaclass could do this for methods, but
>> that of course leaves plain functions in the lurch.
>
> Is it really required? As it is trivial to write a meta class to
> decorator all members of a class, it is also trivial to write a  
> function
> which accepts a module name and decorate all functions in the module.
>
> def decorate_functions(decorator, modname):
>   function = type(decorate_functions)
>   namespace = sys.modules[modname].__dict__
>   for name, obj in namespace.items():
>       if isinstance(obj, function):
>           namespace[name] = decorator(obj)
>
> decorate_functions(somefunc, __name__)
>

This doesn't work, as decorators don't usually commute.  If the module
contains a function:

@foo
def bar(...)

decorate_functions will change it to:

@somefunc
@foo
def bar(...)

Whereas the desired result is:

@foo
@somefunc
def bar(...)
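
The difference is easy to demonstrate with two labelling decorators
(``foo`` and ``somefunc`` here are stand-ins for real ones):

```python
def foo(func):
    def wrapper(*args):
        return "foo(%s)" % func(*args)
    return wrapper

def somefunc(func):
    def wrapper(*args):
        return "somefunc(%s)" % func(*args)
    return wrapper

@foo
def bar():
    return "bar"

# Post-hoc wrapping, as decorate_functions does, puts somefunc outermost:
after = somefunc(bar)
assert after() == "somefunc(foo(bar))"

# The default-decorator semantics want somefunc applied first (innermost):
@foo
@somefunc
def bar2():
    return "bar2"

assert bar2() == "foo(somefunc(bar2))"
```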

-- 
Arnaud





From mark at qtrac.eu  Thu Jan 17 11:15:32 2008
From: mark at qtrac.eu (Mark Summerfield)
Date: Thu, 17 Jan 2008 10:15:32 +0000
Subject: [Python-ideas] Empty set {} and empty dict {:} ?
Message-ID: <200801171015.32729.mark@qtrac.eu>

I just wondered if changing the empty set & dict syntax might be
possible.

    s = {}  # empty set
    d = {:} # empty dict

I think this would be more understandable to newcomers since to create
single element ones we have:

    s = {1}
    d = {1:1}

?

-- 
Mark Summerfield, Qtrac Ltd., www.qtrac.eu



From jimjjewett at gmail.com  Thu Jan 17 16:16:29 2008
From: jimjjewett at gmail.com (Jim Jewett)
Date: Thu, 17 Jan 2008 10:16:29 -0500
Subject: [Python-ideas] Default decorator?
In-Reply-To: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
Message-ID: <fb6fbf560801170716k31dd49c6ud3cea6c4fc0198d7@mail.gmail.com>

On 1/16/08, Guido van Rossum <guido at python.org> wrote:
> Peter Norvig suggested ... "default decorator". ...
> set once per module and ... invoked, before any explicit decorators.
> His use case ... uses argument annotations

I would prefer that it be easier to control module creation in general.

The decorator does meet his use case (and
logging/tracing/registration), but there are varieties that would work
better if the decorator were applied after any explicit decorators.
(OTOH, doing that all the time raises the problem about whether
decorators commute.)  There are already questions about whether it
should apply to nested functions or lambdas.

You can just settle those questions by fiat and meet 90% of the need,
but letting the module control it seems cleaner (and also allows other
extensions, such as alternate module dict representations).

As to how module creation should be controlled -- that is a bit
hairier.  My strawman is a __moduleclass__ analogous to __metaclass__.

myfile.py:
    __moduleclass__=altmod.mod1

would be the moral equivalent of (some other file) running

    import sys
    import altmod
    sys.modules["myfile"]=altmod.mod1()
    exec myfile in sys.modules["myfile"]
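
In today's Python 3, the moral equivalent can be spelled out with
types.ModuleType (the module name and source here are hypothetical):

```python
import sys
import types

class MyModule(types.ModuleType):
    """A custom module type; a __moduleclass__ hook would select this."""

source = "x = 1\ndef f():\n    return x\n"

mod = MyModule("myfile")
sys.modules["myfile"] = mod       # register before executing the body
exec(source, mod.__dict__)        # run the module body in its namespace

import myfile                     # satisfied from sys.modules
assert isinstance(myfile, MyModule)
assert myfile.f() == 1
```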


-jJ


From aahz at pythoncraft.com  Thu Jan 17 17:10:20 2008
From: aahz at pythoncraft.com (Aahz)
Date: Thu, 17 Jan 2008 08:10:20 -0800
Subject: [Python-ideas] Default decorator?
In-Reply-To: <fb6fbf560801170716k31dd49c6ud3cea6c4fc0198d7@mail.gmail.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
	<fb6fbf560801170716k31dd49c6ud3cea6c4fc0198d7@mail.gmail.com>
Message-ID: <20080117161020.GB2660@panix.com>

On Thu, Jan 17, 2008, Jim Jewett wrote:
> 
> I would prefer that it be easier to control module creation in general.

Right.  I haven't been paying much attention, but can the post-import
hook play nicely with this?
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"All problems in computer science can be solved by another level of     
indirection."  --Butler Lampson


From guido at python.org  Thu Jan 17 18:35:37 2008
From: guido at python.org (Guido van Rossum)
Date: Thu, 17 Jan 2008 09:35:37 -0800
Subject: [Python-ideas] Default decorator?
In-Reply-To: <fb6fbf560801170716k31dd49c6ud3cea6c4fc0198d7@mail.gmail.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>
	<fb6fbf560801170716k31dd49c6ud3cea6c4fc0198d7@mail.gmail.com>
Message-ID: <ca471dc20801170935w5e3cf11ct13d2b9997479151d@mail.gmail.com>

Quick response: this was already discussed, the requirement is for
this to happen to *any* def or lambda *before* applying decorators and
*regardless* of where/how they are nested (e.g. methods, nested
functions, etc.).

--Guido

On Jan 17, 2008 7:16 AM, Jim Jewett <jimjjewett at gmail.com> wrote:
> On 1/16/08, Guido van Rossum <guido at python.org> wrote:
> > Peter Norvig suggested ... "default decorator". ...
> > set once per module and ... invoked, before any explicit decorators.
> > His use case ... uses argument annotations
>
> I would prefer that it be easier to control module creation in general.
>
> The decorator does meet his use case (and
> logging/tracing/registration), but there are varieties that would work
> better if the decorator were applied after any explicit decorators.
> (OTOH, doing that all the time raises the problem about whether
> decorators commute.)  There are already questions about whether it
> should apply to nested functions or lambdas.
>
> You can just settle those questions by fiat and meet 90% of the need,
> but letting the module control it seems cleaner (and also allows other
> extensions, such as alternate module dict representations).
>
> As to how module creation should be controlled -- that is a bit
> hairier.  My strawman is a __moduleclass__ analogous to __metaclass__.
>
> myfile.py:
>     __moduleclass__=altmod.mod1
>
> would be the moral equivalent of (some other file) running
>
>     import sys
>     import altmod
>     sys.modules["myfile"]=altmod.mod1()
>     exec myfile in sys.modules["myfile"]
>
>
> -jJ
>



-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From lists at cheimes.de  Thu Jan 17 18:52:14 2008
From: lists at cheimes.de (Christian Heimes)
Date: Thu, 17 Jan 2008 18:52:14 +0100
Subject: [Python-ideas] Default decorator?
In-Reply-To: <20080117161020.GB2660@panix.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>	<fb6fbf560801170716k31dd49c6ud3cea6c4fc0198d7@mail.gmail.com>
	<20080117161020.GB2660@panix.com>
Message-ID: <fmo4ke$838$1@ger.gmane.org>

Aahz wrote:
> Right.  I haven't been paying much attention, but can the post-import
> hook play nicely with this?

Subclasses of the module class definitely work. Other types aren't well
tested yet but they will work, too.

Christian



From steven.bethard at gmail.com  Thu Jan 17 19:10:10 2008
From: steven.bethard at gmail.com (Steven Bethard)
Date: Thu, 17 Jan 2008 11:10:10 -0700
Subject: [Python-ideas] Empty set {} and empty dict {:} ?
In-Reply-To: <200801171015.32729.mark@qtrac.eu>
References: <200801171015.32729.mark@qtrac.eu>
Message-ID: <d11dcfba0801171010sbb325fdv9471b5dcf0188cc@mail.gmail.com>

On Jan 17, 2008 3:15 AM, Mark Summerfield <mark at qtrac.eu> wrote:
> I just wondered if changing the empty set & dict syntax might be
> possible.
>
>     s = {}  # empty set
>     d = {:} # empty dict
>
> I think this would be more understandable to newcomers since to create
> single element ones we have:
>
>     s = {1}
>     d = {1:1}
>
> ?

This was discussed when set literals were added.  It was decided that
it would be too much of a backwards incompatibility even for Python
3.0.  It may be worth re-considering for Python 4.0.  ;-)

Steve
-- 
I'm not *in*-sane. Indeed, I am so far *out* of sane that you appear a
tiny blip on the distant coast of sanity.
        --- Bucky Katt, Get Fuzzy


From george.sakkis at gmail.com  Thu Jan 17 19:23:45 2008
From: george.sakkis at gmail.com (George Sakkis)
Date: Thu, 17 Jan 2008 13:23:45 -0500
Subject: [Python-ideas] Empty set {} and empty dict {:} ?
In-Reply-To: <d11dcfba0801171010sbb325fdv9471b5dcf0188cc@mail.gmail.com>
References: <200801171015.32729.mark@qtrac.eu>
	<d11dcfba0801171010sbb325fdv9471b5dcf0188cc@mail.gmail.com>
Message-ID: <91ad5bf80801171023k12727ef3qeb836a134bdcd05b@mail.gmail.com>

On Jan 17, 2008 1:10 PM, Steven Bethard <steven.bethard at gmail.com> wrote:
> On Jan 17, 2008 3:15 AM, Mark Summerfield <mark at qtrac.eu> wrote:
> > I just wondered if changing the empty set & dict syntax might be
> > possible.
> >
> >     s = {}  # empty set
> >     d = {:} # empty dict
> >
> > I think this would be more understandable to newcomers since to create
> > single element ones we have:
> >
> >     s = {1}
> >     d = {1:1}
> >
> > ?
>
> This was discussed when set literals were added.  It was decided that
> it would be too much of a backwards incompatibility even for Python
> 3.0.  It may be worth re-considering for Python 4.0.  ;-)

How about {/} for empty sets ? (although chances are that this or
something similar has been discussed and rejected too :)).

George


From guido at python.org  Thu Jan 17 19:27:47 2008
From: guido at python.org (Guido van Rossum)
Date: Thu, 17 Jan 2008 10:27:47 -0800
Subject: [Python-ideas] Empty set {} and empty dict {:} ?
In-Reply-To: <91ad5bf80801171023k12727ef3qeb836a134bdcd05b@mail.gmail.com>
References: <200801171015.32729.mark@qtrac.eu>
	<d11dcfba0801171010sbb325fdv9471b5dcf0188cc@mail.gmail.com>
	<91ad5bf80801171023k12727ef3qeb836a134bdcd05b@mail.gmail.com>
Message-ID: <ca471dc20801171027g7159ab94u6aba0e6159c765a0@mail.gmail.com>

Right. All possible proposals have already been discussed at length.
Really, writing set() isn't so bad. Get used to it.

On Jan 17, 2008 10:23 AM, George Sakkis <george.sakkis at gmail.com> wrote:
> On Jan 17, 2008 1:10 PM, Steven Bethard <steven.bethard at gmail.com> wrote:
> > On Jan 17, 2008 3:15 AM, Mark Summerfield <mark at qtrac.eu> wrote:
> > > I just wondered if changing the empty set & dict syntax might be
> > > possible.
> > >
> > >     s = {}  # empty set
> > >     d = {:} # empty dict
> > >
> > > I think this would be more understandable to newcomers since to create
> > > single element ones we have:
> > >
> > >     s = {1}
> > >     d = {1:1}
> > >
> > > ?
> >
> > This was discussed when set literals were added.  It was decided that
> > it would be too much of a backwards incompatibility even for Python
> > 3.0.  It may be worth re-considering for Python 4.0.  ;-)
>
> How about {/} for empty sets ? (although chances are that this or
> something similar has been discussed and rejected too :)).
>
> George
>
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From g.brandl at gmx.net  Thu Jan 17 19:30:42 2008
From: g.brandl at gmx.net (Georg Brandl)
Date: Thu, 17 Jan 2008 19:30:42 +0100
Subject: [Python-ideas] Empty set {} and empty dict {:} ?
In-Reply-To: <91ad5bf80801171023k12727ef3qeb836a134bdcd05b@mail.gmail.com>
References: <200801171015.32729.mark@qtrac.eu>	<d11dcfba0801171010sbb325fdv9471b5dcf0188cc@mail.gmail.com>
	<91ad5bf80801171023k12727ef3qeb836a134bdcd05b@mail.gmail.com>
Message-ID: <fmo6o7$ij3$1@ger.gmane.org>

George Sakkis schrieb:
> On Jan 17, 2008 1:10 PM, Steven Bethard <steven.bethard at gmail.com> wrote:
>> On Jan 17, 2008 3:15 AM, Mark Summerfield <mark at qtrac.eu> wrote:
>> > I just wondered if changing the empty set & dict syntax might be
>> > possible.
>> >
>> >     s = {}  # empty set
>> >     d = {:} # empty dict
>> >
>> > I think this would be more understandable to newcomers since to create
>> > single element ones we have:
>> >
>> >     s = {1}
>> >     d = {1:1}
>> >
>> > ?
>>
>> This was discussed when set literals were added.  It was decided that
>> it would be too much of a backwards incompatibility even for Python
>> 3.0.  It may be worth re-considering for Python 4.0.  ;-)
> 
> How about {/} for empty sets ? (although chances are that this or
> something similar has been discussed and rejected too :)).

It has been discussed and rejected. :)

Georg



From lists at cheimes.de  Thu Jan 17 02:19:29 2008
From: lists at cheimes.de (Christian Heimes)
Date: Thu, 17 Jan 2008 02:19:29 +0100
Subject: [Python-ideas] Default decorator?
In-Reply-To: <ca471dc20801161711g770879fft5625593e87b2ab83@mail.gmail.com>
References: <ca471dc20801161631s1e30226cv80ba3e006d401cd0@mail.gmail.com>	
	<478EA9E4.1090005@cheimes.de>
	<ca471dc20801161711g770879fft5625593e87b2ab83@mail.gmail.com>
Message-ID: <478EAD21.2010406@cheimes.de>



Guido van Rossum wrote:
> Those are indeed the main reasons for wanting something built-in
> rather than something that scans a module's namespace -- the idea
> being that within the scope of the declaration *every* def and lambda
> gets passed to this hook first (and the hook can pass on anything that
> doesn't have annotations, for example).

Ah! Ok ... I got it. It sounds like a job for a new builtin
__build_function__ which is called for the opcode MAKE_FUNCTION.

Christian



From greg at krypto.org  Thu Jan 17 22:46:48 2008
From: greg at krypto.org (Gregory P. Smith)
Date: Thu, 17 Jan 2008 13:46:48 -0800
Subject: [Python-ideas] Empty set {} and empty dict {:} ?
In-Reply-To: <ca471dc20801171027g7159ab94u6aba0e6159c765a0@mail.gmail.com>
References: <200801171015.32729.mark@qtrac.eu>
	<d11dcfba0801171010sbb325fdv9471b5dcf0188cc@mail.gmail.com>
	<91ad5bf80801171023k12727ef3qeb836a134bdcd05b@mail.gmail.com>
	<ca471dc20801171027g7159ab94u6aba0e6159c765a0@mail.gmail.com>
Message-ID: <52dc1c820801171346w19d9a0fcx6233072f5c0ac479@mail.gmail.com>

On 1/17/08, Guido van Rossum <guido at python.org> wrote:
>
> Right. All possible proposals have already been discussed at length.
> Really, writing set() isn't so bad. Get used to it.


+10
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20080117/d0b81d22/attachment.html>

From castironpi at comcast.net  Mon Jan 21 01:14:19 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Sun, 20 Jan 2008 18:14:19 -0600
Subject: [Python-ideas] write to cell
Message-ID: <20080121001415.A3A2B1E402D@bag.python.org>

Decorator 'FreeVarArr', free variable array, returns an array of its
decoratee (decorated function), with free variables set by name to a range
of values.

Produced & attached.

import new
def cell_values( *cells ):
    return [ cell.cell_contents for cell in cells ]

def CellMake( *opts ):
    dumbopts= [ 'opt%i'% i for i in range( len( opts ) ) ]
    dumbvars= ', '.join( dumbopts )
    innermakefmt= "def innermake( %s ):\n\tdef toget(): %s\n\treturn toget"
    exec innermakefmt% ( dumbvars, dumbvars )
    innerfunc= innermake( *opts )
    freevars= list( innerfunc.func_code.co_freevars )
    freevarindices= map( freevars.index, dumbopts )
    return tuple( [ innerfunc.func_closure[i] for i in freevarindices ] )

def FreeVarArr( **kwargs ):
    vars= kwargs.keys()
    iters= [ kwargs[var] for var in vars]
    def newf( wfunc, closure ):
        return new.function( wfunc.func_code, wfunc.func_globals,
wfunc.func_name, wfunc.func_defaults, closure )
    def recall( wfunc ):
        freevars= list( wfunc.func_code.co_freevars )
        freeindices= [ freevars.index( var ) for var in vars ]
        iterzip= zip( *[ iters[i] for i in freeindices ] )
        closures= [ CellMake( *opts ) for opts in iterzip ]
        funcs= [ newf( wfunc, closure) for closure in closures ]
        return funcs
    return recall

if __name__== '__main__':
    print CellMake( 2, 3, 'abc' )
    a= CellMake( 2, 3, 'abc' )

    def who():
        some, someelse, moreelse, moreover= None, None, None, None
        @FreeVarArr( some= range(8,14), someelse= range(2,19), moreelse=
['a']*90, moreover= ['What!']*90 )
        def what():
            "doc string"
            print someelse, moreelse, moreover
            assert moreelse== 'a'
            assert moreover== 'What!'
            print "this prints", some #assign 'some' to range(8,14) in
successive array functions
            print "John says:", some+ someelse, moreover+ ' '+ moreelse
            return some
        return what
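
For comparison, the same free-variable rebinding is much shorter in
today's Python 3, where types.CellType is public (3.8+) and closures can
be rebuilt directly; names here only loosely follow the code above:

```python
import types

def with_free_vars(func, **bindings):
    """Return a copy of func whose named free variables are rebound."""
    freevars = func.__code__.co_freevars
    cells = tuple(
        types.CellType(bindings.get(name, func.__closure__[i].cell_contents))
        for i, name in enumerate(freevars)
    )
    return types.FunctionType(func.__code__, func.__globals__,
                              func.__name__, func.__defaults__, cells)

def template():
    some = None          # placeholder; becomes a free variable of what()
    def what():
        return some * 2
    return what

base = template()
funcs = [with_free_vars(base, some=i) for i in range(8, 11)]
assert [f() for f in funcs] == [16, 18, 20]
```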
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: funcarray.py
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20080120/1371cf9f/attachment.ksh>

From carroll at tjc.com  Mon Jan 21 23:25:31 2008
From: carroll at tjc.com (Terry Carroll)
Date: Mon, 21 Jan 2008 14:25:31 -0800 (PST)
Subject: [Python-ideas] callback support for storxxxx methods in ftplib?
Message-ID: <Pine.LNX.4.44.0801211419340.26578-100000@violet.rahul.net>

What's the chance of having support added to the storbinary and storlines 
to support callback, just like retrbinary and retrlines has, for Python 
2.6?

There's a patch for this at http://bugs.python.org/issue1221598 but it 
hasn't seemed to generate much enthusiasm.

I've had at least one instance where I've needed this, and monkey-patched 
in this patch to get the functionality.  It's also just plain odd that the 
file transfer capability is not symmetrical; is there any reason why 
callback capability is considered important for downloads, but not for 
uploads?
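For what it's worth, the symmetric behaviour is easy to sketch. Here `send` stands in for the data connection's sendall, and the names are illustrative only, not the actual patch's API:

```python
import io

def storbinary(send, fp, blocksize=8192, callback=None):
    # Upload in blocks; hand each block to callback after it is sent,
    # mirroring what retrbinary already does for downloads.
    while True:
        buf = fp.read(blocksize)
        if not buf:
            break
        send(buf)
        if callback is not None:
            callback(buf)

sent, seen = [], []
storbinary(sent.append, io.BytesIO(b"x" * 20000), callback=seen.append)
```

Every block the fake connection receives is also handed to the callback, so a progress bar (or checksum) can watch an upload exactly as it can a download.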





From brett at python.org  Tue Jan 22 00:22:56 2008
From: brett at python.org (Brett Cannon)
Date: Mon, 21 Jan 2008 15:22:56 -0800
Subject: [Python-ideas] callback support for storxxxx methods in ftplib?
In-Reply-To: <Pine.LNX.4.44.0801211419340.26578-100000@violet.rahul.net>
References: <Pine.LNX.4.44.0801211419340.26578-100000@violet.rahul.net>
Message-ID: <bbaeab100801211522o71bd315cne2c7a8bb7625ad23@mail.gmail.com>

On Jan 21, 2008 2:25 PM, Terry Carroll <carroll at tjc.com> wrote:
> What's the chance of having support added to the storbinary and storlines
> to support callback, just like retrbinary and retrlines has, for Python
> 2.6?
>
> There's a patch for this at http://bugs.python.org/issue1221598 but it
> hasn't seemed to generate much enthusiasm.
>
> I've had at least one instance where I've needed this, and monkey-patched
> in this patch to get the functionality.  It's also just plain odd that the
> file transfer capability is not symmetrical; is there any reason why
> callback capability is considered important for downloads, but not for
> uploads?

Something like this requires a core developer to take enough interest
to spend their time and energy on it (I have not looked at the patch,
but I assume it is done in a backwards-compatible fashion). For new
functionality like this, that means either the personal interest of a
core developer, or enough people asking for the feature to motivate a
core developer to put the time in just to be nice.

-Brett


From mark at qtrac.eu  Wed Jan 23 09:59:54 2008
From: mark at qtrac.eu (Mark Summerfield)
Date: Wed, 23 Jan 2008 08:59:54 +0000
Subject: [Python-ideas] adopt an enum type for the standard library?
Message-ID: <200801230859.54917.mark@qtrac.eu>

AFAIK if you want enumerations you must either create your own, or use a
third party module, or use namedtuple:

    Files = collections.namedtuple("Files", "minimum maximum")(1, 200)
    ...
    x = Files.minimum

Using:

    MINIMUM, MAXIMUM = 1, 200

is often inconvenient, since you might have several different ones. Of
course you could do:

    MIN_FILES = 1
    MIN_DIRS = 0

Personally, I like enums and consider them fundamental, but I
don't like the above approaches.

There is an enum module in PyPI 
http://pypi.python.org/pypi/enum/
and there are several versions in the Python Cookbook.

Wouldn't one of these be worth adopting for the standard library?

-- 
Mark Summerfield, Qtrac Ltd., www.qtrac.eu



From castironpi at comcast.net  Wed Jan 23 10:44:25 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Wed, 23 Jan 2008 03:44:25 -0600
Subject: [Python-ideas] An easier syntax for writing decorators (&
	similar things)?
Message-ID: <20080123094421.BFB421E401E@bag.python.org>

> -----Original Message-----
> On 8 Oct 2007, at 10:57, Arnaud Delobelle wrote:
> >
> > On Mon, October 8, 2007 4:33 am, Adam Atlas wrote:
> >> When writing decorators especially when it's one that needs arguments
> >> other than the function to be wrapped, it often gets rather ugly...
> [...]
> > Whhy not create a (meta-)decorator to do this? Something like:
> [...]

Following up post from 10/8/07.

> To follow up on my untested suggestion, here's one that is tested:
> 
> # This metadecorator hasn't changed
> 
> def decorator_withargs(decf):
>      def decorator(*args, **kwargs):
>          def decorated(f):
>              return decf(f, *args, **kwargs)
>          return decorated
>      return decorator

This is equivalent to:
(1)
      decorator_withargs= partial( partial, prepartial )

where prepartial is roughly the same as partial, as you might expect:

(2)
1     def prepartial(func, *args, **keywords):
2           def newfunc(*fargs, **fkeywords):
3               newkeywords = keywords.copy()
4               newkeywords.update(fkeywords)
5               return func(*(fargs+ args), **newkeywords)
6           newfunc.func = func
7           newfunc.args = args
8           newfunc.keywords = keywords
9           return newfunc

partial is the same except on line 5:

(3)
5               return func(*(args + fargs), **newkeywords)

Results are the same:

-> f
1
f -> 2
2

Intriguing.
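The claimed equivalence can be checked directly. This is a sketch in current syntax, with a toy one-argument decorator standing in for mydec:

```python
from functools import partial

def prepartial(func, *args, **keywords):
    # Like functools.partial, but the frozen args go *after* the
    # call-time args (line 5 of the listing above).
    def newfunc(*fargs, **fkeywords):
        newkeywords = {**keywords, **fkeywords}
        return func(*(fargs + args), **newkeywords)
    return newfunc

decorator_withargs = partial(partial, prepartial)

@decorator_withargs
def tagged(f, tag):
    # toy decorator taking one extra argument besides the function
    def decorated(x):
        return (tag, f(x))
    return decorated

@tagged("result")
def double(x):
    return x * 2
```

Unwinding it: decorator_withargs(tagged) is partial(prepartial, tagged); calling that with "result" gives prepartial(tagged, "result"); applying that to double finally calls tagged(double, "result"), which is exactly what the nested-def version does.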

> # Here's how to use it to create a decorator
> 
> @decorator_withargs
> def mydec(f, before='entering %s', after='%s returns %%s'):
>      before = before % f.__name__
>      after = after % f.__name__
>      def decorated(*args, **kwargs):
>          print before
>          result = f(*args, **kwargs)
>          print after % result
>          return result
>      return decorated
> 
> 
> # Now I can decorate a function with my new decorator
> 
> @mydec(before='-> %s', after='%s -> %%s')
> def f(x):
>      print x
>      return x+1
> 
> 
> Then
> 
>  >>> f(1)
> -> f
> 1
> f -> 2
> 2
> 
> --
> Arnaud



From lists at cheimes.de  Wed Jan 23 14:16:48 2008
From: lists at cheimes.de (Christian Heimes)
Date: Wed, 23 Jan 2008 14:16:48 +0100
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <200801230859.54917.mark@qtrac.eu>
References: <200801230859.54917.mark@qtrac.eu>
Message-ID: <fn7eo0$t2n$1@ger.gmane.org>

Mark Summerfield wrote:
> There is an enum module in PyPI 
> http://pypi.python.org/pypi/enum/
> and there are several versions in the Python Cookbook.
> 
> Wouldn't one of these be worth adopting for the standard library?


It might be worth adding an enum to Python 2.6. I'm +0 on it.

The enum implementation from pypi is not sufficient for Python core. I
don't like its __cmp__ and __hash__ code. I also miss the ability to set
a start value or to skip values:

>>> enum = Enum("error=-1", "ok", "also_ok", "someother=1000", "last")
>>> enum.error
-1
>>> enum.ok
0
>>> enum.also_ok
1
>>> enum.someother
1000
>>> enum.last
1001
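A minimal sketch of those semantics (hypothetical, not the PyPI implementation) might look like:

```python
class Enum:
    # "name=value" entries set an explicit value; bare names continue
    # counting from the previous value, as in the session above.
    def __init__(self, *names):
        value = -1  # so the first bare name becomes 0
        for name in names:
            if "=" in name:
                name, _, raw = name.partition("=")
                value = int(raw)
            else:
                value += 1
            setattr(self, name, value)

enum = Enum("error=-1", "ok", "also_ok", "someother=1000", "last")
```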

Christian



From taleinat at gmail.com  Wed Jan 23 15:36:26 2008
From: taleinat at gmail.com (Tal Einat)
Date: Wed, 23 Jan 2008 16:36:26 +0200
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <fn7eo0$t2n$1@ger.gmane.org>
References: <200801230859.54917.mark@qtrac.eu> <fn7eo0$t2n$1@ger.gmane.org>
Message-ID: <7afdee2f0801230636sf91ebbk1896bd693db4e0bc@mail.gmail.com>

Christian Heimes wrote:
> Mark Summerfield wrote:
> > There is an enum module in PyPI
> > http://pypi.python.org/pypi/enum/
> > and there are several versions in the Python Cookbook.
> >
> > Wouldn't one of these be worth adopting for the standard library?
>
>
> It might be worth adding an enum to Python 2.6. I'm +0 on it.
>
> The enum implementation from pypi is not sufficient for Python core. I
> don't like its __cmp__ and __hash__ code. I also miss the feature to set
> a start value or to skip values:
>
> >>> enum = Enum("error=-1", "ok", "also_ok", "someother=1000", "last")
> >>> enum.error
> -1
> >>> enum.ok
> 0
> >>> enum.also_ok
> 1
> >>> enum.someother
> 1000
> >>> enum.last
> 1001
>
> Christian


-1 on adding a specific construct for enums.


What I usually do in Python is this:

ERROR, OK, ALSO_OK = range(-1, -1 + 3)

Start and stop values, as well as step sizes other than one, are easily
achieved. Skipping values is done like this:

ERROR, OK, ALSO_OK, SOMEOTHER, LAST = range(-1, -1 + 3) + range(1000, 1000 + 2)

Updating the range(s) appropriately when adding/changing values is
easy enough. I find this solution to be good enough since the values
are not ever meant to be used explicitly, and values are not
added/changed often.

This is perhaps not very pretty or concise, but it's simple and it
only uses Python syntax and constructs which everyone knows and
understands. Readability and simplicity are usually my top concerns,
so this fits the bill.
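The idiom can be written so the ranges are checked against the names (note that in Python 3 the ranges need list() before concatenation):

```python
names = ["ERROR", "OK", "ALSO_OK", "SOMEOTHER", "LAST"]
values = list(range(-1, -1 + 3)) + list(range(1000, 1000 + 2))
assert len(names) == len(values)  # catch a mismatched range early
ERROR, OK, ALSO_OK, SOMEOTHER, LAST = values
```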

- Tal


From mark at qtrac.eu  Wed Jan 23 16:12:41 2008
From: mark at qtrac.eu (Mark Summerfield)
Date: Wed, 23 Jan 2008 15:12:41 +0000
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <7afdee2f0801230636sf91ebbk1896bd693db4e0bc@mail.gmail.com>
References: <200801230859.54917.mark@qtrac.eu> <fn7eo0$t2n$1@ger.gmane.org>
	<7afdee2f0801230636sf91ebbk1896bd693db4e0bc@mail.gmail.com>
Message-ID: <200801231512.41281.mark@qtrac.eu>

On 2008-01-23, Tal Einat wrote:
> Christian Heimes wrote:
> > Mark Summerfield wrote:
> > > There is an enum module in PyPI
[snip]
>
> -1 on adding a specific construct for enums.
>
>
> What I usually do in Python is this:
>
> ERROR, OK, ALSO_OK = range(-1, -1 + 3)
>
> Start and stop values, as well step sizes other than one, are easily
> achieved. Skipping values is done like this:
>
> ERROR, OK, ALSO_OK, SOMEOTHER, LAST = range(-1, -1 + 3) + range(1000, 1000
> + 2)
>
> Updating the range(s) appropriately when adding/changing values is
> easy enough. I find this solution to be good enough since the values
> are not ever meant to be used explicitly, and values are not
> added/changed often.
>
> This is perhaps not very pretty or concise, but it's simple and it
> only uses Python syntax and constructs which everyone knows and
> understands. Readability and simplicity are usually my top concerns,
> so this fits the bill.

Unfortunately, the "const-ness" of enums defined this way is merely
conventional, so it is easy to change the value of one by mistake.

Using namedtuples means that if you try to assign to the thing you at
least get an exception raised. Not that I'm particularly in favour of
using namedtuples for this purpose, but they are the only "convenient"
way to have enums based on the standard library that I know of.
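For example, attempting to rebind a namedtuple field raises AttributeError, because the instances have empty __slots__ and the fields are read-only properties:

```python
import collections

Files = collections.namedtuple("Files", "minimum maximum")(1, 200)

rebind_failed = False
try:
    Files.minimum = 5  # read-only: there is no property setter
except AttributeError:
    rebind_failed = True
```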

Although I think that enums are fundamental (as are sets, and yet it
took many years for them to arrive in the standard library), at least
putting enums in the standard library would allow those who want them to
use them out of the box without impinging on those who are happy with
not having them.

As for which enum implementation, whether the one from PyPI or one from
the cookbook, or another one entirely, I have no strong feelings.

-- 
Mark Summerfield, Qtrac Ltd., www.qtrac.eu



From eduardo.padoan at gmail.com  Wed Jan 23 16:16:40 2008
From: eduardo.padoan at gmail.com (Eduardo O. Padoan)
Date: Wed, 23 Jan 2008 13:16:40 -0200
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <200801230859.54917.mark@qtrac.eu>
References: <200801230859.54917.mark@qtrac.eu>
Message-ID: <dea92f560801230716k197c5718t7a2593a8bae634ee@mail.gmail.com>

On Jan 23, 2008 6:59 AM, Mark Summerfield <mark at qtrac.eu> wrote:
> Wouldn't one of these be worth adopting for the standard library?

This was rejected before:
http://www.python.org/dev/peps/pep-0354/

-- 
http://www.advogato.org/person/eopadoan/
Bookmarks: http://del.icio.us/edcrypt


From guido at python.org  Wed Jan 23 16:19:25 2008
From: guido at python.org (Guido van Rossum)
Date: Wed, 23 Jan 2008 07:19:25 -0800
Subject: [Python-ideas] An easier syntax for writing decorators (&
	similar things)?
In-Reply-To: <20080123094421.BFB421E401E@bag.python.org>
References: <20080123094421.BFB421E401E@bag.python.org>
Message-ID: <ca471dc20801230719i43a58969h9a99a2ec6ce3d8c3@mail.gmail.com>

I'm wondering, will Aaron ever realize that no-one understands his posts?

On Jan 23, 2008 1:44 AM, Aaron Brady <castironpi at comcast.net> wrote:
> > -----Original Message-----
> > On 8 Oct 2007, at 10:57, Arnaud Delobelle wrote:
> > >
> > > On Mon, October 8, 2007 4:33 am, Adam Atlas wrote:
> > >> When writing decorators especially when it's one that needs arguments
> > >> other than the function to be wrapped, it often gets rather ugly...
> > [...]
> > > Whhy not create a (meta-)decorator to do this? Something like:
> > [...]
>
> Following up post from 10/8/07.
>
> > To follow up on my untested suggestion, here's one that is tested:
> >
> > # This metadecorator hasn't changed
> >
> > def decorator_withargs(decf):
> >      def decorator(*args, **kwargs):
> >          def decorated(f):
> >              return decf(f, *args, **kwargs)
> >          return decorated
> >      return decorator
>
> This is equivalent to:
> (1)
>       decorator_withargs= partial( partial, prepartial )
>
> , where prepartial is roughly the same as partial as you might expect:
>
> (2)
> 1     def prepartial(func, *args, **keywords):
> 2           def newfunc(*fargs, **fkeywords):
> 3               newkeywords = keywords.copy()
> 4               newkeywords.update(fkeywords)
> 5               return func(*(fargs+ args), **newkeywords)
> 6           newfunc.func = func
> 7           newfunc.args = args
> 8           newfunc.keywords = keywords
> 9           return newfunc
>
> Partial is the same outside of line 5:
>
> (3)
> 5               return func(*(args + fargs), **newkeywords)
>
> Results are the same:
>
> -> f
> 1
> f -> 2
> 2
>
> Intriguing.
>
> > # Here's how to use it to create a decorator
> >
> > @decorator_withargs
> > def mydec(f, before='entering %s', after='%s returns %%s'):
> >      before = before % f.__name__
> >      after = after % f.__name__
> >      def decorated(*args, **kwargs):
> >          print before
> >          result = f(*args, **kwargs)
> >          print after % result
> >          return result
> >      return decorated
> >
> >
> > # Now I can decorate a function with my new decorator
> >
> > @mydec(before='-> %s', after='%s -> %%s')
> > def f(x):
> >      print x
> >      return x+1
> >
> >
> > Then
> >
> >  >>> f(1)
> > -> f
> > 1
> > f -> 2
> > 2
> >
> > --
> > Arnaud
>
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From mark at qtrac.eu  Wed Jan 23 16:51:35 2008
From: mark at qtrac.eu (Mark Summerfield)
Date: Wed, 23 Jan 2008 15:51:35 +0000
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <dea92f560801230716k197c5718t7a2593a8bae634ee@mail.gmail.com>
References: <200801230859.54917.mark@qtrac.eu>
	<dea92f560801230716k197c5718t7a2593a8bae634ee@mail.gmail.com>
Message-ID: <200801231551.35451.mark@qtrac.eu>

On 2008-01-23, Eduardo O. Padoan wrote:
> On Jan 23, 2008 6:59 AM, Mark Summerfield <mark at qtrac.eu> wrote:
> > Wouldn't one of these be worth adopting for the standard library?
>
> This was rejected before:
> http://www.python.org/dev/peps/pep-0354/

So were set comprehensions, but they're in now...

-- 
Mark Summerfield, Qtrac Ltd., www.qtrac.eu



From castironpi at comcast.net  Wed Jan 23 19:08:14 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Wed, 23 Jan 2008 12:08:14 -0600
Subject: [Python-ideas] An easier syntax for writing decorators
	(&similar things)?
In-Reply-To: <ca471dc20801230719i43a58969h9a99a2ec6ce3d8c3@mail.gmail.com>
Message-ID: <20080123180829.E40011E400F@bag.python.org>

This is like totally. cool.  Like Mr. Delobelle's post on the decorator's
and stuff was like, a simplified version of this: 

decorator_withargs= partial( partial, prepartial )

Which is way cool, and I couldn't even understand, and like, wrap my brain
around and stuff.  Do uuuu understand?  It?

> -----Original Message-----
> From: python-ideas-bounces at python.org [mailto:python-ideas-
> bounces at python.org] On Behalf Of Guido van Rossum
> Sent: Wednesday, January 23, 2008 9:19 AM
> To: python-ideas at python.org
> Subject: Re: [Python-ideas] An easier syntax for writing decorators
> (&similar things)?
> 
> I'm wondering, will Aaron ever realize that no-one understands his posts?
> 
> On Jan 23, 2008 1:44 AM, Aaron Brady <castironpi at comcast.net> wrote:
> > > -----Original Message-----
> > > On 8 Oct 2007, at 10:57, Arnaud Delobelle wrote:
> > > >
> > > > On Mon, October 8, 2007 4:33 am, Adam Atlas wrote:
> > > >> When writing decorators especially when it's one that needs
> arguments
> > > >> other than the function to be wrapped, it often gets rather ugly...
> > > [...]
> > > > Whhy not create a (meta-)decorator to do this? Something like:
> > > [...]
> >
> > Following up post from 10/8/07.
> >
> > > To follow up on my untested suggestion, here's one that is tested:
> > >
> > > # This metadecorator hasn't changed
> > >
> > > def decorator_withargs(decf):
> > >      def decorator(*args, **kwargs):
> > >          def decorated(f):
> > >              return decf(f, *args, **kwargs)
> > >          return decorated
> > >      return decorator
> >
> > This is equivalent to:
> > (1)
> >       decorator_withargs= partial( partial, prepartial )
> >
> > , where prepartial is roughly the same as partial as you might expect:
> >
> > (2)
> > 1     def prepartial(func, *args, **keywords):
> > 2           def newfunc(*fargs, **fkeywords):
> > 3               newkeywords = keywords.copy()
> > 4               newkeywords.update(fkeywords)
> > 5               return func(*(fargs+ args), **newkeywords)
> > 6           newfunc.func = func
> > 7           newfunc.args = args
> > 8           newfunc.keywords = keywords
> > 9           return newfunc
> >
> > Partial is the same outside of line 5:
> >
> > (3)
> > 5               return func(*(args + fargs), **newkeywords)
> >
> > Results are the same:
> >
> > -> f
> > 1
> > f -> 2
> > 2
> >
> > Intriguing.
> >
> > > # Here's how to use it to create a decorator
> > >
> > > @decorator_withargs
> > > def mydec(f, before='entering %s', after='%s returns %%s'):
> > >      before = before % f.__name__
> > >      after = after % f.__name__
> > >      def decorated(*args, **kwargs):
> > >          print before
> > >          result = f(*args, **kwargs)
> > >          print after % result
> > >          return result
> > >      return decorated
> > >
> > >
> > > # Now I can decorate a function with my new decorator
> > >
> > > @mydec(before='-> %s', after='%s -> %%s')
> > > def f(x):
> > >      print x
> > >      return x+1
> > >
> > >
> > > Then
> > >
> > >  >>> f(1)
> > > -> f
> > > 1
> > > f -> 2
> > > 2
> > >
> > > --
> > > Arnaud
> >
> > _______________________________________________
> > Python-ideas mailing list
> > Python-ideas at python.org
> > http://mail.python.org/mailman/listinfo/python-ideas
> >
> 
> 
> 
> --
> --Guido van Rossum (home page: http://www.python.org/~guido/)
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas



From arno at marooned.org.uk  Wed Jan 23 19:42:42 2008
From: arno at marooned.org.uk (Arnaud Delobelle)
Date: Wed, 23 Jan 2008 18:42:42 +0000
Subject: [Python-ideas] An easier syntax for writing decorators (&
	similar things)?
In-Reply-To: <20080123094421.BFB421E401E@bag.python.org>
References: <20080123094421.BFB421E401E@bag.python.org>
Message-ID: <3B602BAC-4865-4724-9DFD-254384E58680@marooned.org.uk>


On 23 Jan 2008, at 09:44, Aaron Brady wrote:

>> -----Original Message-----
>> On 8 Oct 2007, at 10:57, Arnaud Delobelle wrote:
>>>
>>> On Mon, October 8, 2007 4:33 am, Adam Atlas wrote:
>>>> When writing decorators especially when it's one that needs  
>>>> arguments
>>>> other than the function to be wrapped, it often gets rather ugly...
>> [...]
>>> Whhy not create a (meta-)decorator to do this? Something like:
>> [...]
>
> Following up post from 10/8/07.
>
>> To follow up on my untested suggestion, here's one that is tested:
>>
>> # This metadecorator hasn't changed
>>
>> def decorator_withargs(decf):
>>    def decorator(*args, **kwargs):
>>        def decorated(f):
>>            return decf(f, *args, **kwargs)
>>        return decorated
>>    return decorator
>
> This is equivalent to:
> (1)
>     decorator_withargs= partial( partial, prepartial )

[where partial is as in functools and
prepartial(f, x, y)(z, t) <=> f(z, t, x, y)]

Indeed, and if one restricts decorator_withargs to keyword arguments,  
one can simply define it as:

decorator_withargs = partial(partial, partial)

Which is the curry operator!  So decorator_withargs is some sort of  
curry. In fact I had never realised before that this was a way to  
define curry (for functions with 2 arguments)

curry = partial(partial, partial)

[if f is a two-arguments function then curry(f)(x)(y) is f(x, y)]

This means that a meta-decorator (i.e. a decorator for decorators) is  
a kind of currying operator.
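The identity is easy to verify for a two-argument function:

```python
from functools import partial

curry = partial(partial, partial)

def add(x, y):
    return x + y

add2 = curry(add)    # partial(partial, add)
add_two = add2(2)    # partial(add, 2)
result = add_two(3)  # add(2, 3)
```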

> Intriguing.

!

-- 
Arnaud




From castironpi at comcast.net  Wed Jan 23 19:54:22 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Wed, 23 Jan 2008 12:54:22 -0600
Subject: [Python-ideas] An easier syntax for writing decorators
	(&similar things)?
In-Reply-To: <3B602BAC-4865-4724-9DFD-254384E58680@marooned.org.uk>
Message-ID: <20080123185427.813FD1E400F@bag.python.org>

> >>>> When writing decorators especially when it's one that needs
> >>>> arguments
> >>>> other than the function to be wrapped, it often gets rather ugly...
> >> [...]
> >>> Whhy not create a (meta-)decorator to do this? Something like:
> >> [...]
> >
> > Following up post from 10/8/07.
> >
> >> To follow up on my untested suggestion, here's one that is tested:
> >>
> >> # This metadecorator hasn't changed
> >>
> >> def decorator_withargs(decf):
> >>    def decorator(*args, **kwargs):
> >>        def decorated(f):
> >>            return decf(f, *args, **kwargs)
> >>        return decorated
> >>    return decorator
> >
> > This is equivalent to:
> > (1)
> >     decorator_withargs= partial( partial, prepartial )
> 
> [where partial is as in functools and
> prepartial(f, x, y)(z, t) <=> f(z, t, x, y)]
> 
> Indeed, and if one restricts decorator_withargs to keyword arguments,
> one can simply define it as:
> 
> decorator_withargs = partial(partial, partial)

I believe you can put -f- as the last positional argument and have this work:

def mydec( arg0, arg1, f, before='entering %s', after='%s returns %%s' )

> Which is the curry operator!  So decorator_withargs is some sort of
> curry. In fact I had never realised before that this was a way to
> define curry (for functions with 2 arguments)
> 
> curry = partial(partial, partial)
> 
> [if f is a two-arguments function then curry(f)(x)(y) is f(x, y)]
> 
> This means that a meta-decorator (i.e. a decorator for decorators) is
> a kind of currying operator.
> 
> > Intriguing.
> 
> !



From taleinat at gmail.com  Wed Jan 23 20:14:54 2008
From: taleinat at gmail.com (Tal Einat)
Date: Wed, 23 Jan 2008 21:14:54 +0200
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <200801231512.41281.mark@qtrac.eu>
References: <200801230859.54917.mark@qtrac.eu> <fn7eo0$t2n$1@ger.gmane.org>
	<7afdee2f0801230636sf91ebbk1896bd693db4e0bc@mail.gmail.com>
	<200801231512.41281.mark@qtrac.eu>
Message-ID: <7afdee2f0801231114t1b4f5464m6f02f7fc90efa37d@mail.gmail.com>

Mark Summerfield wrote:
> On 2008-01-23, Tal Einat wrote:
> >
> > -1 on adding a specific construct for enums.
> >
> >
> > What I usually do in Python is this:
> >
> > ERROR, OK, ALSO_OK = range(-1, -1 + 3)
> >
> > Start and stop values, as well step sizes other than one, are easily
> > achieved. Skipping values is done like this:
> >
> > ERROR, OK, ALSO_OK, SOMEOTHER, LAST = range(-1, -1 + 3) + range(1000, 1000
> > + 2)
> >
> > Updating the range(s) appropriately when adding/changing values is
> > easy enough. I find this solution to be good enough since the values
> > are not ever meant to be used explicitly, and values are not
> > added/changed often.
> >
> > This is perhaps not very pretty or concise, but it's simple and it
> > only uses Python syntax and constructs which everyone knows and
> > understands. Readability and simplicity are usually my top concerns,
> > so this fits the bill.
>
> Unfortunately, the "const-ness" of enums defined this way is merely
> conventional, so it is easy to change the value of one by mistake.
>
> Using namedtuples means that if you try to assign to the thing you at
> least get an exception raised. Not that I'm particularly in favour of
> using namedtuples for this purpose, but they are the only "convenient"
> way to have enums based on the standard library that I know of.

(You mean that namedtuples are the only "convenient" way to have
_"const"_ enums, right?)

Well, not many things in Python are "const" at all. I didn't realize
this ("const-ness") was a criterion. So, just to be clear, you want a
construct which allows setting sequential numerical values to names
(variables or otherwise), which become "const" from that point
onwards. Is this correct?

If that's the case, then I'm still -1. The reason is that, IMHO and
AFAIK, having "const" things is un-Pythonic. I think it is -great- that I
can override anything in Python, even if the original author of the
code didn't imagine a reason one would want to do so.

- Tal


From wrobell at pld-linux.org  Thu Jan 24 02:21:22 2008
From: wrobell at pld-linux.org (wrobell)
Date: Thu, 24 Jan 2008 01:21:22 +0000
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <7afdee2f0801230636sf91ebbk1896bd693db4e0bc@mail.gmail.com>
References: <200801230859.54917.mark@qtrac.eu> <fn7eo0$t2n$1@ger.gmane.org>
	<7afdee2f0801230636sf91ebbk1896bd693db4e0bc@mail.gmail.com>
Message-ID: <20080124012122.GE7629@borg>

On Wed, Jan 23, 2008 at 04:36:26PM +0200, Tal Einat wrote:
[...]
> What I usually do in Python is this:
> 
> ERROR, OK, ALSO_OK = range(-1, -1 + 3)
> 
> Start and stop values, as well step sizes other than one, are easily
> achieved. Skipping values is done like this:
> 
> ERROR, OK, ALSO_OK, SOMEOTHER, LAST = range(-1, -1 + 3) + range(1000, 1000 + 2)

are you able to decode that at 4am? :)

[...]

regards,

    wrobell <wrobell at pld-linux.org>


From castironpi at comcast.net  Thu Jan 24 03:06:19 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Wed, 23 Jan 2008 20:06:19 -0600
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <20080123185427.813FD1E400F@bag.python.org>
Message-ID: <20080124020623.A4DB21E400F@bag.python.org>

> > >>>> When writing decorators especially when it's one that needs
> > >>>> arguments
> > >>>> other than the function to be wrapped, it often gets rather ugly...
> > >> [...]
> > >>> Whhy not create a (meta-)decorator to do this? Something like:
> > >> [...]
> > >
> > >> To follow up on my untested suggestion, here's one that is tested:
> > >>
> > >> # This metadecorator hasn't changed
> > >>
> > >> def decorator_withargs(decf):
> > >>    def decorator(*args, **kwargs):
> > >>        def decorated(f):
> > >>            return decf(f, *args, **kwargs)
> > >>        return decorated
> > >>    return decorator
> > >
> > > This is equivalent to:
> > > (1)
> > >     decorator_withargs= partial( partial, prepartial )
> >
> > [where partial is as in functools and
> > prepartial(f, x, y)(z, t) <=> f(z, t, x, y)]
> >
> > Indeed, and if one restricts decorator_withargs to keyword arguments,
> > one can simply define it as:
> >
> > decorator_withargs = partial(partial, partial)
> 
> I believe you can put -f- in the last pos'l argument and have this work.
> 
> def mydec( arg0, arg1, f, before='entering %s', after='%s returns %%s').
> 
> > Which is the curry operator!  So decorator_withargs is some sort of
> > curry. In fact I had never realised before that this was a way to
> > define curry (for functions with 2 arguments)
> >
> > curry = partial(partial, partial)
> >
> > [if f is a two-arguments function then curry(f)(x)(y) is f(x, y)]
> >
> > This means that a meta-decorator (i.e. a decorator for decorators) is
> > a kind of currying operator.
> >
> > > Intriguing.
> >
> > !

	def f( callback, *bar, **bkwar ):
		def preg ( callfore, *far, **fkwar ):
			sf= g( callback, callfore, bar, bkwar, far, fkwar )
			return sf
		return preg

We see how to rewrite this one?



From steven.bethard at gmail.com  Thu Jan 24 03:12:03 2008
From: steven.bethard at gmail.com (Steven Bethard)
Date: Wed, 23 Jan 2008 19:12:03 -0700
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <20080124020623.A4DB21E400F@bag.python.org>
References: <20080123185427.813FD1E400F@bag.python.org>
	<20080124020623.A4DB21E400F@bag.python.org>
Message-ID: <d11dcfba0801231812j968211cxb4e0834691dc5e31@mail.gmail.com>

On Jan 23, 2008 7:06 PM, Aaron Brady <castironpi at comcast.net> wrote:
>         def f( callback, *bar, **bkwar ):
>                 def preg ( callfore, *far, **fkwar ):
>                         sf= g( callback, callfore, bar, bkwar, far, fkwar )
>                         return sf
>                 return preg
>
> We see how to rewrite this one?

The Python Ideas list is really intended as sort of a testing ground
for potential PEPs. What is it that you're proposing to add to or
remove from the language?

If you're just enjoying writing code in Python, comp.lang.python is a
more appropriate place to post.

STeVe
-- 
I'm not *in*-sane. Indeed, I am so far *out* of sane that you appear a
tiny blip on the distant coast of sanity.
        --- Bucky Katt, Get Fuzzy


From castironpi at comcast.net  Thu Jan 24 06:55:56 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Wed, 23 Jan 2008 23:55:56 -0600
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <20080124012122.GE7629@borg>
Message-ID: <20080124055600.5F2AE1E4025@bag.python.org>

> > What I usually do in Python is this:
> >
> > ERROR, OK, ALSO_OK = range(-1, -1 + 3)
> >
> > Start and stop values, as well step sizes other than one, are easily
> > achieved. Skipping values is done like this:
> >
> > ERROR, OK, ALSO_OK, SOMEOTHER, LAST = range(-1, -1 + 3) + range(1000,
> 1000 + 2)

Crazy:

	wheres= Enum()
	wheres.UP
	wheres.DOWN
	wheres.LEFT
	wheres.close()
	print UP, DOWN, LEFT, RIGHT
0 1 2 3

Crazier:
	wheres= Enum()
	while wheres.open:
		with wheres:
			UP, DOWN, LEFT, RIGHT
	print UP, DOWN, LEFT, RIGHT
0 1 2 3



From castironpi at comcast.net  Thu Jan 24 06:55:56 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Wed, 23 Jan 2008 23:55:56 -0600
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <d11dcfba0801231812j968211cxb4e0834691dc5e31@mail.gmail.com>
Message-ID: <20080124055606.788991E4011@bag.python.org>

> >         def f( callback, *bar, **bkwar ):
> >                 def preg ( callfore, *far, **fkwar ):
> >                         sf= g( callback, callfore, bar, bkwar, far,
> fkwar )
> >                         return sf
> >                 return preg
> >
> > We see how to rewrite this one?
> 
> The Python Ideas list is really intended as sort of a testing ground
> for potential PEPs. What is it that you're proposing to add to or
> remove from the language?

+1 on prepartial in functools.



From castironpi at comcast.net  Thu Jan 24 07:11:09 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Thu, 24 Jan 2008 00:11:09 -0600
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <20080124055600.5F2AE1E4025@bag.python.org>
Message-ID: <20080124061112.A3B931E4011@bag.python.org>

> > > What I usually do in Python is this:
> > >
> > > ERROR, OK, ALSO_OK = range(-1, -1 + 3)
> > >
> > > Start and stop values, as well step sizes other than one, are easily
> > > achieved. Skipping values is done like this:
> > >
> > > ERROR, OK, ALSO_OK, SOMEOTHER, LAST = range(-1, -1 + 3) + range(1000,
> > 1000 + 2)

And yes, there's also:

	wheres= Enum( UP= 0, DOWN= 0, LEFT= 0, RIGHT= 0 )
	print UP, DOWN, LEFT, RIGHT
	print wheres.UP, wheres.DOWN, wheres.LEFT, wheres.RIGHT
2 0 3 1
2 0 3 1

and:

	wheres= Enum( 0, 'UP DOWN LEFT RIGHT'.split() )
	print UP, DOWN, LEFT, RIGHT
	print wheres.UP, wheres.DOWN, wheres.LEFT, wheres.RIGHT
0 1 2 3
0 1 2 3



From steven.bethard at gmail.com  Thu Jan 24 07:15:43 2008
From: steven.bethard at gmail.com (Steven Bethard)
Date: Wed, 23 Jan 2008 23:15:43 -0700
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <20080124055606.788991E4011@bag.python.org>
References: <d11dcfba0801231812j968211cxb4e0834691dc5e31@mail.gmail.com>
	<20080124055606.788991E4011@bag.python.org>
Message-ID: <d11dcfba0801232215v5efdb2dam9f2d74eb11099788@mail.gmail.com>

On Jan 23, 2008 10:55 PM, Aaron Brady <castironpi at comcast.net> wrote:
> > >         def f( callback, *bar, **bkwar ):
> > >                 def preg ( callfore, *far, **fkwar ):
> > >                         sf= g( callback, callfore, bar, bkwar, far, fkwar )
> > >                         return sf
> > >                 return preg
> > >
> > > We see how to rewrite this one?
> >
> > The Python Ideas list is really intended as sort of a testing ground
> > for potential PEPs. What is it that you're proposing to add to or
> > remove from the language?
>
> +1 on prepartial in functools.

So far, I've only seen this one use case for prepartial - and I'd
rather have decorator_withargs itself than prepartial.  Do you have
any other use cases for prepartial (from real code somewhere)?

Currently, I'm -1 on adding prepartial to functools, and +0.5 on
adding something like decorator_withargs.

STeVe
-- 
I'm not *in*-sane. Indeed, I am so far *out* of sane that you appear a
tiny blip on the distant coast of sanity.
        --- Bucky Katt, Get Fuzzy


From castironpi at comcast.net  Thu Jan 24 07:20:50 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Thu, 24 Jan 2008 00:20:50 -0600
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <d11dcfba0801232215v5efdb2dam9f2d74eb11099788@mail.gmail.com>
Message-ID: <20080124062053.6092D1E400F@bag.python.org>

> > > >         def f( callback, *bar, **bkwar ):
> > > >                 def preg ( callfore, *far, **fkwar ):
> > > >                         sf= g( callback, callfore, bar, bkwar, far, fkwar )
> > > >                         return sf
> > > >                 return preg
> > > >
> > > > We see how to rewrite this one?
> > >
> > > The Python Ideas list is really intended as sort of a testing ground
> > > for potential PEPs. What is it that you're proposing to add to or
> > > remove from the language?
> >
> > +1 on prepartial in functools.
> 
> So far, I've only seen this one use case for prepartial - and I'd
> rather have decorator_withargs itself than prepartial.  Do you have
> any other use cases for prepartial (from real code somewhere)?
> 
> Currently, I'm -1 on adding prepartial to functools, and +0.5 on
> adding something like decorator_withargs.

Are you also -1 on partial's being in functools?



From mark at qtrac.eu  Thu Jan 24 09:24:28 2008
From: mark at qtrac.eu (Mark Summerfield)
Date: Thu, 24 Jan 2008 08:24:28 +0000
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <7afdee2f0801231114t1b4f5464m6f02f7fc90efa37d@mail.gmail.com>
References: <200801230859.54917.mark@qtrac.eu>
	<200801231512.41281.mark@qtrac.eu>
	<7afdee2f0801231114t1b4f5464m6f02f7fc90efa37d@mail.gmail.com>
Message-ID: <200801240824.29041.mark@qtrac.eu>

On 2008-01-23, Tal Einat wrote:
> Mark Summerfield wrote:
> > On 2008-01-23, Tal Einat wrote:
> > > -1 on adding a specific construct for enums.
> > >
> > >
> > > What I usually do in Python is this:
> > >
> > > ERROR, OK, ALSO_OK = range(-1, -1 + 3)
> > >
> > > Start and stop values, as well step sizes other than one, are easily
> > > achieved. Skipping values is done like this:
> > >
> > > ERROR, OK, ALSO_OK, SOMEOTHER, LAST = range(-1, -1 + 3) + range(1000,
> > > 1000 + 2)
> > >
> > > Updating the range(s) appropriately when adding/changing values is
> > > easy enough. I find this solution to be good enough since the values
> > > are not ever meant to be used explicitly, and values are not
> > > added/changed often.
> > >
> > > This is perhaps not very pretty or concise, but it's simple and it
> > > only uses Python syntax and constructs which everyone knows and
> > > understands. Readability and simplicity are usually my top concerns,
> > > so this fits the bill.
> >
> > Unfortunately, the "const-ness" of enums defined this way is merely
> > conventional, so it is easy to change the value of one by mistake.
> >
> > Using namedtuples means that if you try to assign to the thing you at
> > least get an exception raised. Not that I'm particularly in favour of
> > using namedtuples for this purpose, but they are the only "convenient"
> > way to have enums based on the standard library that I know of.
>
> (You mean that namedtuples are the only "convenient" way to have
> _"const"_ enums, right?)
>
> Well, not many things in Python are "const" at all. I didn't realize
> this ("const-ness") was a criterion. So, just to be clear, you want a
> construct which allows setting sequential numerical values to names
> (variables or otherwise), which become "const" from that point
> onwards. Is this correct?

This is the kind of thing I'm looking for:

    flags = Enum("OK", "ERROR", "OTHER") # defaults to sequential ints
    flags.OK == 0
    flags.ERROR == 1
    flags.OTHER == 2
    flags.OK = 5 # exception raised
    flags.FOO # exception raised
    str(flags.OK) == "OK"

    for flag in flags: # iterates by name
        number = flags[flag]

    flags[2] == "OTHER"
    flags["ERROR"] == 1
    flags[99] # exception raised
    flags["FOO"] # exception raised

other syntaxes:

    flags = Enum(OK=1, ERROR=-5, OTHER=100)
    flags = Enum((("OK", 1), ("OTHER", 100), ("ERROR", -5))) # useful with zip()

It doesn't matter if someone can figure out a way to change an enum
value, so long as assignment doesn't work, to avoid accidental changes.
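[Editor's note: a minimal sketch of a class with roughly the semantics requested above (illustrative only; it does not attempt the str(flags.OK) == "OK" behavior, which would need a value wrapper type):]

```python
class Enum(object):
    # Illustrative sketch only -- not a proposed implementation.
    def __init__(self, *names, **named):
        # Positional names get sequential ints; keyword form uses the
        # given values.
        pairs = list(zip(names, range(len(names)))) + sorted(named.items())
        object.__setattr__(self, '_by_name', dict(pairs))
        object.__setattr__(self, '_by_value',
                           dict((v, k) for k, v in pairs))

    def __setattr__(self, name, value):
        # flags.OK = 5 raises, guarding against accidental changes.
        raise AttributeError('enum values are read-only')

    def __getattr__(self, name):
        try:
            return self._by_name[name]
        except KeyError:
            raise AttributeError(name)

    def __getitem__(self, key):
        # flags[2] == "OTHER"; flags["ERROR"] == 1; unknown keys raise.
        mapping = self._by_name if isinstance(key, str) else self._by_value
        return mapping[key]

    def __iter__(self):
        return iter(self._by_name)  # iterates by name
```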

Could I implement this myself? Yes. So could any Python programmer.

But that's no use---right now there is no _standard_ way to have enums
in Python. So you either download the PyPI enum package, use one of the
versions from the Python Cookbook, or write your own. In other words, for
enums we have the Perl-like "there's more than one way to do it". Oh,
and there's one other way, using the standard library:

    flags = collections.namedtuple("Enum", "OK ERROR OTHER")(0, 1, 2)

I think enums are a common enough requirement to be worth adding to the
standard library---not to the language.

-- 
Mark Summerfield, Qtrac Ltd., www.qtrac.eu



From python at rcn.com  Thu Jan 24 10:02:50 2008
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 24 Jan 2008 01:02:50 -0800
Subject: [Python-ideas] adopt an enum type for the standard library?
Message-ID: <002501c85e67$e62f4880$6800a8c0@RaymondLaptop1>

[Mark Summerfeld]
> I think enums are a common enough requirement to be worth
>adding to the standard library---not to the language.

-1  This is clutter.

Also, I disagree about there being a need.  We've long been able to roll our own with a simple class definition (like the first 
example shown below), but people rarely do that.  They don't want to pay the cost in speed (for the attribute lookup) and find that 
all the prefixed references make their code look more Javaesque than Pythonesque.

Over years of looking at tons of Python code, I've seen several variants of enums.  In *every* case, they were just a toy recipe and 
were not used in anything other than demo code.  I know it's possible to invent use cases for this, but I think real world needs 
have largely been met through module and class namespaces.  If you have a lot of constants, the simple approach is to put them in 
their own module and then reference them using mod.name.  If there are a medium number, then a class namespace (like the first 
example below) may be preferable.  If there are only a few, then people seem to be happy with globals.

Something like "wheres= Enum( UP= 0, DOWN= 0, LEFT= 0, RIGHT= 0 )" is easily written as:

    class wheres:
        UP = 0; DOWN = 0; LEFT = 0; RIGHT = 0

Or equivalently:

       wheres = type('wheres', (), dict(UP= 0, DOWN= 0, LEFT= 0, RIGHT= 0 ))

Or using the module approach:

     -- wheres.py --
     UP = 0; DOWN = 0; LEFT = 0; RIGHT = 0

     -- mymod.py --
     import wheres
     print wheres.UP

Or using globals approach and avoid the attribute lookup:

      UP = 0; DOWN = 0; LEFT = 0; RIGHT = 0

There are already so many reasonable ways to do this or avoid doing it, that it would be silly to add an Enum factory.

IMO, Enum() is in the category of recipes that are like variants of flatten() or Sudoku solvers; they are fun to write but no one 
really needs them.

not-everything-in-the-cookbook-needs-to-be-in-the-library-ly yours,


Raymond 


From mark at qtrac.eu  Thu Jan 24 11:04:06 2008
From: mark at qtrac.eu (Mark Summerfield)
Date: Thu, 24 Jan 2008 10:04:06 +0000
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <002501c85e67$e62f4880$6800a8c0@RaymondLaptop1>
References: <002501c85e67$e62f4880$6800a8c0@RaymondLaptop1>
Message-ID: <200801241004.07048.mark@qtrac.eu>

On 2008-01-24, Raymond Hettinger wrote:
> [Mark Summerfeld]
>
> > I think enums are a common enough requirement to be worth
> >adding to the standard library---not to the language.
>
> -1  This is clutter.
>
> Also, I disagree about there being a need.  We've long been able to roll
> our own with a simple class definition (like the first example shown
> below), but people rarely do that.  They don't want to pay the cost in
> speed (for the attribute lookup) and find that all the prefixed references
> make their code look more Javaesque that Pythonesque.
>
> Over years of looking at tons of Python code, I've seen several variants of
> enums.  In *every* case, they were just a toy recipe and were not used
> in anything other than demo code.

Did you also see lots of examples of people creating their own set
types? Until something is part of the language or library people will
toy with their own version but probably won't commit to it. But once
something is standardised it gets used---if it is general purpose. For
example, I use sets all the time now, but they are a relatively new
feature in Python, so before I often had dicts with None values.

[snipped examples]

> There are already so many reasonable ways to do this or avoid doing it,
> that it would be silly to add an Enum factory.

I thought that one of Python's strengths was supposed to be that in
general there is just one best way to do things.

> IMO, Enum() is in the category of recipes that are like variants of
> flatten() or Sudoku solvers; they are fun to write but no one really needs
> them.

None of the examples you gave provides any level of const-ness. Yet
namedtuples and tuples do.

And an enum factory function for collections could be as simple as:

def enum(field_names, values=None):
    e = collections.namedtuple("_enum", field_names)
    if values is None:
        values = range(len(field_names.split()))
    return e(*values)

(Although this doesn't account for pickling.)

>>> Vehicle = enum("car van bus truck")
>>> Vehicle.car, Vehicle.van, Vehicle.bus, Vehicle.truck
(0, 1, 2, 3)
>>> Limit = enum("minimum maximum default", (-50, 50, 10))
>>> Limit.default
10

-- 
Mark Summerfield, Qtrac Ltd., www.qtrac.eu



From junk at themilkyway.com  Thu Jan 24 11:03:18 2008
From: junk at themilkyway.com (Jonathan Marshall)
Date: Thu, 24 Jan 2008 10:03:18 +0000
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <fn7eo0$t2n$1@ger.gmane.org>
References: <200801230859.54917.mark@qtrac.eu> <fn7eo0$t2n$1@ger.gmane.org>
Message-ID: <47986266.9050809@themilkyway.com>

Christian Heimes wrote:
> Mark Summerfield wrote:
>   
>> There is an enum module in PyPI 
>> http://pypi.python.org/pypi/enum/
>> and there are several versions in the Python Cookbook.
>>
>> Wouldn't one of these be worth adopting for the standard library?
>>     
>
>
> It might be worth adding an enum to Python 2.6. I'm +0 on it.
>
> The enum implementation from pypi is not sufficient for Python core. I
> don't like its __cmp__ and __hash__ code. I also miss the feature to set
> a start value or to skip values:
>
>   
I'd be +1 on adding an enum type. I chose an enum type from the cookbook 
for our company to use. All was great until 1-2 years later when we 
needed to start persisting objects that contained enums. We found that 
the publicly available enums wouldn't cope and we had to invest 
significant effort in changing our code. E.g. the enum type from pypi:

 >>> from enum import Enum
 >>> import pickle
 >>>
 >>> Colours = Enum('red', 'blue', 'green')
 >>> x = pickle.dumps(Colours.red)
 >>> y = pickle.loads(x)
 >>> print y
red
 >>> assert y == Colours.red
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AssertionError

Also, we find that experienced python programmers are used to the absence 
of a standard enum type but new python programmers are surprised by its 
absence - it's not 'batteries included'.

So I think there would be value in adding an enum to the standard 
library that does the job correctly.
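[Editor's note: the failure Jonathan shows is typical of enum values that compare by identity: unpickling creates a fresh object that is no longer the same object as the class-level constant. A hedged sketch of the failure mode and one possible fix via value-based equality; the class names here are hypothetical, not the PyPI package's actual internals:]

```python
import pickle

class IdentityValue(object):
    # Compares by identity (the default), like the failing enum above.
    def __init__(self, enum_name, index):
        self.enum_name = enum_name
        self.index = index

class ComparableValue(IdentityValue):
    # Value-based equality survives a pickle round trip.
    def __eq__(self, other):
        return (isinstance(other, IdentityValue) and
                (self.enum_name, self.index) ==
                (other.enum_name, other.index))

    def __hash__(self):
        return hash((self.enum_name, self.index))

red = IdentityValue('Colours', 0)
assert pickle.loads(pickle.dumps(red)) != red      # the reported bug

red2 = ComparableValue('Colours', 0)
assert pickle.loads(pickle.dumps(red2)) == red2    # round trip now works
```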


From steven.bethard at gmail.com  Thu Jan 24 17:44:15 2008
From: steven.bethard at gmail.com (Steven Bethard)
Date: Thu, 24 Jan 2008 09:44:15 -0700
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <20080124062053.6092D1E400F@bag.python.org>
References: <d11dcfba0801232215v5efdb2dam9f2d74eb11099788@mail.gmail.com>
	<20080124062053.6092D1E400F@bag.python.org>
Message-ID: <d11dcfba0801240844q474b4df6yd918bbdd4b3db689@mail.gmail.com>

On Jan 23, 2008 11:20 PM, Aaron Brady <castironpi at comcast.net> wrote:
> > > > >         def f( callback, *bar, **bkwar ):
> > > > >                 def preg ( callfore, *far, **fkwar ):
> > > > >                         sf= g( callback, callfore, bar, bkwar, far, fkwar )
> > > > >                         return sf
> > > > >                 return preg
> > > > >
> > > > > We see how to rewrite this one?
> > > >
> > > > The Python Ideas list is really intended as sort of a testing ground
> > > > for potential PEPs. What is it that you're proposing to add to or
> > > > remove from the language?
> > >
> > > +1 on prepartial in functools.
> >
> > So far, I've only seen this one use case for prepartial - and I'd
> > rather have decorator_withargs itself than prepartial.  Do you have
> > any other use cases for prepartial (from real code somewhere)?
> >
> > Currently, I'm -1 on adding prepartial to functools, and +0.5 on
> > adding something like decorator_withargs.
>
> Are you also -1 on partial's being in functools?

Not that it matters, since it's already there, but no, I wasn't.  In
fact, the presence of partial is a big reason not to need prepartial
-- most existing use cases are already covered by using partial. As
Arnaud pointed out, if you restrict decorator_withargs to keyword
arguments, you don't even need prepartial, you can just use partial
itself.  Thus, I don't see prepartial as really covering many new use
cases.  If you'd like to convince me otherwise, you're going to need
to post some use cases from real code.

STeVe
-- 
I'm not *in*-sane. Indeed, I am so far *out* of sane that you appear a
tiny blip on the distant coast of sanity.
        --- Bucky Katt, Get Fuzzy


From python at rcn.com  Thu Jan 24 17:53:21 2008
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 24 Jan 2008 08:53:21 -0800
Subject: [Python-ideas] adopt an enum type for the standard library?
References: <002501c85e67$e62f4880$6800a8c0@RaymondLaptop1>
	<200801241004.07048.mark@qtrac.eu>
Message-ID: <008901c85ea9$a19e76e0$6800a8c0@RaymondLaptop1>

>> Over years of looking at tons of Python code, I've seen several variants of
>> enums.  In *every* case, they were just a toy recipe and were not used
>> in anything other than demo code.

[Mark Summerfeld]
> Did you also see lots of examples of people creating their own set
> types?

> Until something is part of the language or library people will
> toy with their own version but probably won't commit to it. But once
> something is standardised it gets used---if it is general purpose.

This is true.  Unfortunately, it means that if you add something to
the language that shouldn't be there, it will get used also.  When
a tool is present, it suggests that the language designers have 
thought it through and are recommending it.  It then takes time
and experience to unlearn the habit (for example, it takes a while
to learn that pervasive isinstance() checks get in the way of
duck typing).  This is doubly true for features that are found
in compiled languages but of questionable value in a dynamic
language.

IMO, the hardest thing for a Python newbie is to stop coding
like a Java programmer.



>> There are already so many reasonable ways to do this or avoid doing it,
>> that it would be silly to add an Enum factory.
> 
> I thought that one of Python's strengths was supposed to be that in
> general there is just one best way to do things.

It's strange that you bring that up as an argument for adding yet another
type of namespace.


> None of the examples you gave provides any level of const-ness.

That is not a virtue.  It is just pseudo const-ness.  It is dog slow
and not compiled as a constant.  All you're getting is something
that is harder to write to.  That doesn't make sense in a
consenting adults environment.

Take a look through the standard library at how many times we
make an attribute read-only by using property().  It is just not
our style.

BTW, the rational module needs to undo that choice for
numerator and denominator.  It makes the code slow
for basically zero benefit.
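[Editor's note: the property() idiom Raymond refers to looks roughly like this (a hedged sketch, not the actual stdlib code):]

```python
class Rational(object):
    def __init__(self, numerator, denominator):
        self._numerator = numerator
        self._denominator = denominator

    @property
    def numerator(self):
        # Read-only: no setter is defined, so assignment raises
        # AttributeError -- at the cost of a function call per access,
        # which is the slowdown Raymond objects to.
        return self._numerator

    @property
    def denominator(self):
        return self._denominator

r = Rational(3, 4)
try:
    r.numerator = 5
except AttributeError:
    pass  # assignment is blocked
```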

[Jonathan Marshall]
> we find that experienced python programmers are use to the
> absence of a standard enum type but new python programmers
> are surprised by its absence

I think this is a clear indication that new programmers all wrestle
with learning to write Pythonically instead of making shallow
translations of code they would have written in statically compiled
languages. Adding something like Enum() will only reinforce those
habits and slow down their progress towards becoming a good
python programmer.



Raymond


From python at rcn.com  Thu Jan 24 20:14:54 2008
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 24 Jan 2008 14:14:54 -0500 (EST)
Subject: [Python-ideas] Ellipsis Literal
Message-ID: <20080124141454.AFQ56708@ms19.lnh.mail.rcn.net>

[Raymond Hettinger]
>> I missed the conversation on this one.
>> Was there a use case or a reason to add this?

[Robert Kern]
> It was added for Numeric a long time ago.

The ellipsis syntax has been around for a while.
What is new is the Ellipsis literal in Py3.0.
See snippets below.

Raymond


------------------------------------
Python 2.6a0 (trunk:59985:59987, Jan 15 2008, 12:54:09) [GCC 4.1.1 (Gentoo 4.1.1)] on linux2
>>> ...
  File "<stdin>", line 1
    ...
    ^
SyntaxError: invalid syntax

------------------------------------
Python 3.0a2+ (py3k:60204M, Jan 22 2008, 14:50:12) [GCC 4.1.1 (Gentoo 4.1.1)] on linux2
>>> ...
Ellipsis


From castironpi at comcast.net  Thu Jan 24 21:36:53 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Thu, 24 Jan 2008 14:36:53 -0600
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <d11dcfba0801240844q474b4df6yd918bbdd4b3db689@mail.gmail.com>
Message-ID: <20080124203723.C1E0F1E4013@bag.python.org>

> > > > +1 on prepartial in functools.
> > >
> > > So far, I've only seen this one use case for prepartial - and I'd
> > > rather have decorator_withargs itself than prepartial.  Do you have
> > > any other use cases for prepartial (from real code somewhere)?
> > >
> > > Currently, I'm -1 on adding prepartial to functools, and +0.5 on
> > > adding something like decorator_withargs.
> >
> > Are you also -1 on partial's being in functools?
> 
> Not that it matters, since it's already there, but no, I wasn't.

Good.  I was.  It got in.

> In
> fact, the presence of partial is a big reason not to need prepartial

Fallacy.

> -- most existing use cases are already covered by using partial.

Lacks citation.

> As
> Arnaud pointed out, if you restrict decorator_withargs to keyword
> arguments,

You don't.

> Thus, I don't see prepartial as really covering many new use
> cases.

Syllogism:

A -> B
A
_____
B

Thus what again?

> If you'd like to convince me otherwise, you're going to need
> to post some use cases from real code.

curry= partial( partial, prepartial ) [1]

[1] http://www.smlnj.org/




From steven.bethard at gmail.com  Thu Jan 24 21:53:16 2008
From: steven.bethard at gmail.com (Steven Bethard)
Date: Thu, 24 Jan 2008 13:53:16 -0700
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <20080124203723.C1E0F1E4013@bag.python.org>
References: <d11dcfba0801240844q474b4df6yd918bbdd4b3db689@mail.gmail.com>
	<20080124203723.C1E0F1E4013@bag.python.org>
Message-ID: <d11dcfba0801241253t23b428e9rcbba9748514c8b35@mail.gmail.com>

Steven Bethard wrote:
[after suggesting that prepartial doesn't gain us much over what we
already have with functools.partial]
>
> If you'd like to convince me otherwise, you're going to need
> to post some use cases from real code.

On Jan 24, 2008 1:36 PM, Aaron Brady <castironpi at comcast.net> wrote:
> curry= partial( partial, prepartial ) [1]
>
> [1] http://www.smlnj.org/

I don't consider that a use case, or real code. ;-)  Yes, you can
construct curry with it.  But what do you want to use curry for?  Show
me some actual Python packages that use the curry function (or your
prepartial function) and then we can talk.

STeVe
-- 
I'm not *in*-sane. Indeed, I am so far *out* of sane that you appear a
tiny blip on the distant coast of sanity.
        --- Bucky Katt, Get Fuzzy


From castironpi at comcast.net  Thu Jan 24 21:56:28 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Thu, 24 Jan 2008 14:56:28 -0600
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <d11dcfba0801241253t23b428e9rcbba9748514c8b35@mail.gmail.com>
Message-ID: <20080124205636.43AD41E4015@bag.python.org>

> I don't consider that a use case, or real code. ;-)  Yes, you can
> construct curry with it.  But what do you want to use curry for?  Show
> me some actual Python packages that use the curry function (or your
> prepartial function) and then we can talk.

That's a, who uses partial?



From castironpi at comcast.net  Thu Jan 24 21:57:42 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Thu, 24 Jan 2008 14:57:42 -0600
Subject: [Python-ideas] adopt an enum type for the standard library?
In-Reply-To: <002501c85e67$e62f4880$6800a8c0@RaymondLaptop1>
Message-ID: <20080124205748.D276F1E4013@bag.python.org>

> Something like "wheres= Enum( UP= 0, DOWN= 0, LEFT= 0, RIGHT= 0 )" is
> easily written as:
> 
>     class wheres:
          @staticmethod
          def reassign():
              wheredir= [ v for v in dir( wheres ) if
                  not callable( getattr( wheres, v ) ) and
                  not v.startswith( '_' ) ]
              for i, v in enumerate( wheredir ):
                  setattr( wheres, v, i )
>         UP = 0; DOWN = 0; LEFT = 0; RIGHT = 0
      wheres.reassign()

We typically reassign enum values once they're assigned, and sometimes just
add new ones.

I tend to write new classes where I think people use enums.  20% longer
code; 80% fewer ifs. [1]

[1] http://en.wikipedia.org/wiki/Visitor_pattern
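[Editor's note: a small hedged illustration of the trade Aaron describes: replacing enum-driven `if` chains with one class per case, so dispatch moves into method lookup. The names are invented for the example:]

```python
# Enum-style: a tag constant plus an if-chain at every use site.
UP, DOWN = 0, 1

def step_enum(direction, y):
    if direction == UP:
        return y - 1
    elif direction == DOWN:
        return y + 1
    raise ValueError(direction)

# Class-per-case: slightly more code, but no ifs at the use sites.
class Up(object):
    def step(self, y):
        return y - 1

class Down(object):
    def step(self, y):
        return y + 1

assert step_enum(UP, 5) == Up().step(5) == 4
assert step_enum(DOWN, 5) == Down().step(5) == 6
```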




From guido at python.org  Thu Jan 24 23:15:40 2008
From: guido at python.org (Guido van Rossum)
Date: Thu, 24 Jan 2008 14:15:40 -0800
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <20080124203723.C1E0F1E4013@bag.python.org>
References: <d11dcfba0801240844q474b4df6yd918bbdd4b3db689@mail.gmail.com>
	<20080124203723.C1E0F1E4013@bag.python.org>
Message-ID: <ca471dc20801241415u187c2c15y1127410ba9e6f7f6@mail.gmail.com>

Aaron, I don't know who you are, and I don't know what you are doing
here, but this type of post with monosyllabic answers really pisses me
off. I suggest you take it somewhere else, or learn to have a normal
discussion. Where did you learn this type of behavior? If you think
this is normal or even acceptable, well, it isn't. Also, you seem to
be abusing this list for a purpose it isn't meant for. You risk being
kicked off.

On Jan 24, 2008 12:36 PM, Aaron Brady <castironpi at comcast.net> wrote:
> > > > > +1 on prepartial in functools.
> > > >
> > > > So far, I've only seen this one use case for prepartial - and I'd
> > > > rather have decorator_withargs itself than prepartial.  Do you have
> > > > any other use cases for prepartial (from real code somewhere)?
> > > >
> > > > Currently, I'm -1 on adding prepartial to functools, and +0.5 on
> > > > adding something like decorator_withargs.
> > >
> > > Are you also -1 on partial's being in functools?
> >
> > Not that it matters, since it's already there, but no, I wasn't.
>
> Good.  I was.  It got in.
>
> > In
> > fact, the presence of partial is a big reason not to need prepartial
>
> Fallacy.
>
> > -- most existing use cases are already covered by using partial.
>
> Lacks citation.
>
> > As
> > Arnaud pointed out, if you restrict decorator_withargs to keyword
> > arguments,
>
> You don't.
>
> > Thus, I don't see prepartial as really covering many new use
> > cases.
>
> Syllogism:
>
> A -> B
> A
> _____
> B
>
> Thus what again?
>
> > If you'd like to convince me otherwise, you're going to need
> > to post some use cases from real code.
>
> curry= partial( partial, prepartial ) [1]
>
> [1] http://www.smlnj.org/
>
>
>
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From castironpi at comcast.net  Thu Jan 24 23:26:00 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Thu, 24 Jan 2008 16:26:00 -0600
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <d11dcfba0801241253t23b428e9rcbba9748514c8b35@mail.gmail.com>
Message-ID: <20080124222606.C98C81E4013@bag.python.org>

> I don't consider that a use case, or real code. ;-)  Yes, you can
> construct curry with it.  But what do you want to use curry for?  Show
> me some actual Python packages that use the curry function (or your
> prepartial function) and then we can talk.

Original function:
1.	urlparse( urlstring[, default_scheme[, allow_fragments]])
2.	urlretrieve( url[, filename[, reporthook[, data]]])

Prepartial in action:
1.	parseA= prepartial( 'ftp', False )
2.	retrieveA= prepartial( 'temp.htm', callbackA )

Equivalent:
1.	parseAB= partial( default_scheme= 'ftp', allow_fragments= True )
2.	retrieveAB= partial( filename= 'temp.htm', reporthook= callbackA )

Equivalent calls:
1.	parseA( 'www.cwi.nl/%7Eguido/Python.html' )
2.	retrieveA( 'http://python.org/' )

Motto:
- Programmer time is important
- The stdlib does not contain application-level design



From castironpi at comcast.net  Thu Jan 24 23:30:06 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Thu, 24 Jan 2008 16:30:06 -0600
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <ca471dc20801241415u187c2c15y1127410ba9e6f7f6@mail.gmail.com>
Message-ID: <20080124223013.05B7E1E400F@bag.python.org>

The personal dispute is tabled.

Confer:
http://mail.python.org/pipermail/python-ideas/2008-January/001357.html
(Thu Jan 24 23:26:00 CET 2008).



From steven.bethard at gmail.com  Fri Jan 25 01:22:41 2008
From: steven.bethard at gmail.com (Steven Bethard)
Date: Thu, 24 Jan 2008 17:22:41 -0700
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <4799107d.265f260a.4d8d.ffffcae3SMTPIN_ADDED@mx.google.com>
References: <d11dcfba0801241253t23b428e9rcbba9748514c8b35@mail.gmail.com>
	<4799107d.265f260a.4d8d.ffffcae3SMTPIN_ADDED@mx.google.com>
Message-ID: <d11dcfba0801241622o66cb3671w6c8170583e713aee@mail.gmail.com>

On Jan 24, 2008 3:26 PM, Aaron Brady <castironpi at comcast.net> wrote:
> > I don't consider that a use case, or real code. ;-)  Yes, you can
> > construct curry with it.  But what do you want to use curry for?  Show
> > me some actual Python packages that use the curry function (or your
> > prepartial function) and then we can talk.
>
> Original function:
> 1.      urlparse( urlstring[, default_scheme[, allow_fragments]])
> 2.      urlretrieve( url[, filename[, reporthook[, data]]])
>
> Prepartial in action:
> 1.      parseA= prepartial( 'ftp', False )
> 2.      retrieveA= prepartial( 'temp.htm', callbackA )
>
> Equivalent:
> 1.      parseAB= partial( default_scheme= 'ftp', allow_fragments= True )
> 2.      retrieveAB= partial( filename= 'temp.htm', reporthook= callbackA )

This is closer to what I'm asking for but these are still not
instances of real code[1]. To build a convincing argument for a new
language feature, you need to show not only a theoretical use case,
but a practical one.  In general, that means you need to show that it
is a frequent need in a large code base.  Finding a bunch of examples
of it in the standard library or big applications like Zope or Twisted
would be a good start.  This is something that's expected of any
PEP-worthy idea.

[1] I know you didn't copy these from real code anywhere because they
all have typos -- they're missing urlparse/urlretrieve as the first
arguments to partial() or prepartial().

STeVe
-- 
I'm not *in*-sane. Indeed, I am so far *out* of sane that you appear a
tiny blip on the distant coast of sanity.
        --- Bucky Katt, Get Fuzzy


From castironpi at comcast.net  Fri Jan 25 03:03:38 2008
From: castironpi at comcast.net (Aaron Brady)
Date: Thu, 24 Jan 2008 20:03:38 -0600
Subject: [Python-ideas] An easier syntax for writing decorators(&similar
	things)?
In-Reply-To: <d11dcfba0801241622o66cb3671w6c8170583e713aee@mail.gmail.com>
Message-ID: <20080125020353.BAB621E400F@bag.python.org>

> [1] I know you didn't copy these from real code anywhere because they
> all have typos -- they're missing urlparse/urlretrieve as the first
> arguments to partial() or prepartial().

Repair treated.

Original function:
1.      urlparse( urlstring[, default_scheme[, allow_fragments]])
2.      urlretrieve( url[, filename[, reporthook[, data]]])

In action:
1.      parseA= prepartial( urlparse, 'ftp', False )
2.      retrieveA= prepartial( urlretrieve, 'temp.htm', callbackA )

Equivalent A:
1.      parseAB= partial( urlparse,
            default_scheme= 'ftp', allow_fragments= False )
2.      retrieveAB= partial( urlretrieve,
            filename= 'temp.htm', reporthook= callbackA )

Equivalent B:
1.      parseAC= lambda url: urlparse( url, 'ftp', False )
2.      retrieveAC= lambda url: urlretrieve( url, 'temp.htm', callbackA )
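[Editor's note: for reference, a hedged sketch of the prepartial under discussion (it is not in functools): like functools.partial, except the frozen positional arguments are appended after the call-time arguments, matching the usage above. The urlparse signature stub is illustrative, not the real urllib function:]

```python
import functools

def prepartial(func, *frozen_args, **frozen_kwargs):
    # Sketch only: call-time positional args come FIRST, then the
    # frozen ones -- the reverse of functools.partial.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        merged = dict(frozen_kwargs)
        merged.update(kwargs)
        return func(*(args + frozen_args), **merged)
    return wrapper

def urlparse_stub(url, default_scheme='', allow_fragments=True):
    # Stand-in with urlparse's argument order.
    return (url, default_scheme, allow_fragments)

parseA = prepartial(urlparse_stub, 'ftp', False)
assert parseA('www.cwi.nl/%7Eguido/Python.html') == \
    ('www.cwi.nl/%7Eguido/Python.html', 'ftp', False)
```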

Examples in gestation; check back in several months; a +fraction; you're on
the right track.



From ntoronto at cs.byu.edu  Sat Jan 26 05:27:58 2008
From: ntoronto at cs.byu.edu (Neil Toronto)
Date: Fri, 25 Jan 2008 21:27:58 -0700
Subject: [Python-ideas] A number is a number
Message-ID: <479AB6CE.4080206@cs.byu.edu>

Isn't it?

Python 3.0 integers are just integers. Now ceil and floor return ints 
from float arguments. It seems like things are moving closer to just 
having a single "number" type. For most applications, because of duck 
typing, you'd never know whether you're punting a float, int, or 
complex number around, and you'd usually not care.

Why not just go whole-hog and unify them all?

Neil


From guido at python.org  Sat Jan 26 05:30:21 2008
From: guido at python.org (Guido van Rossum)
Date: Fri, 25 Jan 2008 20:30:21 -0800
Subject: [Python-ideas] A number is a number
In-Reply-To: <479AB6CE.4080206@cs.byu.edu>
References: <479AB6CE.4080206@cs.byu.edu>
Message-ID: <ca471dc20801252030w5ac4f8e5y6c7c5ff63961dc22@mail.gmail.com>

On Jan 25, 2008 8:27 PM, Neil Toronto <ntoronto at cs.byu.edu> wrote:
> Isn't it?
>
> Python 3.0 integers are just integers. Now ceil and floor return ints
> from float arguments. It seems like things are moving closer to just
> having a single "number" type. For most applications, because of duck
> typing, you'd never know or care whether you're punting a float, int, or
> complex number around.
>
> Why not just go whole-hog and unify them all?

Because you don't want floats to be accidentally used as list indices.
And because range(0, 1, 0.1) would be poorly defined. And because you
can't order complex numbers.

For many purposes you never have to care about what kinds of numbers
you're manipulating. But there are still plenty of cases where it does
matter, and then the different types help.
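For instance, the first and last points can be demonstrated directly
(Python 3 semantics, as a quick sketch):

```python
# Floats are deliberately rejected as sequence indices.
items = ["a", "b", "c"]
try:
    items[1.0]
    float_index_ok = True
except TypeError:
    float_index_ok = False

# Complex numbers have no total order, so comparisons raise TypeError.
try:
    (1 + 2j) < (3 + 4j)
    complex_order_ok = True
except TypeError:
    complex_order_ok = False
```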

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From van.lindberg at gmail.com  Sat Jan 26 18:18:12 2008
From: van.lindberg at gmail.com (VanL)
Date: Sat, 26 Jan 2008 11:18:12 -0600
Subject: [Python-ideas] More on set literals
Message-ID: <fnfq0l$r44$1@ger.gmane.org>


I like Raymond's suggestion (apparently adopted) that {1,2,3} is a 
frozenset literal. However, I also like the {{1,2,3}} syntax. So I would 
propose that {{}} become the literal for the *mutable* set, not the 
frozenset.

So:
frozenset() # empty frozenset
{1,2,3} # frozenset literal

{} # empty dict
{1:2, 3:4} #dict literal

{{}} (or set()) # empty mutable set
{{1,2,3}} # set literal

My rationale is as follows:
1. It visually distinguishes sets and frozensets, without making them 
take up a lot of space.  If people see the {{}}, they will be reminded 
that this set is mutable.

2. In Raymond's example, his point was that most people don't want a 
mutable literal - they would be better served by a frozenset.  However, 
sometimes you *do* want to add elements to a set literal.  For example:

# I am parsing the configuration file for a pure-python webserver.
# The config file allows me to add new filetype extensions
# that will be served.

# Default HTML extensions:
HTML_EXTS = {{'.html', '.htm'}}

# later, when going through config file:
if filetype.handler == HTMLHandler:
     HTML_EXTS.add(filetype.ext)

I know that this can be done other ways (set(['.html', '.htm'])), but I 
like the way this looks.
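For comparison, the same pattern with the existing spelling (the config
values here are made up for illustration):

```python
# The same configuration pattern with the existing set() spelling,
# since {{...}} is only a proposal.
HTML_EXTS = set(['.html', '.htm'])

# later, when going through the config file:
configured_exts = ['.shtml', '.xhtml']   # hypothetical config values
for ext in configured_exts:
    HTML_EXTS.add(ext)
```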

Thanks,

Van

P.S.: The bikeshed should be yellow.



From adam at atlas.st  Sat Jan 26 18:28:18 2008
From: adam at atlas.st (Adam Atlas)
Date: Sat, 26 Jan 2008 12:28:18 -0500
Subject: [Python-ideas] More on set literals
In-Reply-To: <fnfq0l$r44$1@ger.gmane.org>
References: <fnfq0l$r44$1@ger.gmane.org>
Message-ID: <73A9AF43-6A45-439F-85AC-E5B19E48AB0A@atlas.st>


On 26 Jan 2008, at 12:18, VanL wrote:
> I like Raymond's suggestion (apparently adopted) that {1,2,3} is a
> frozenset literal. However, I also like the {{1,2,3}} syntax. So I  
> would
> propose that {{}} become the literal for the *mutable* set, not the
> frozenset.

How would this be distinguished from a singleton frozenset containing  
another frozenset?


From lists at cheimes.de  Sat Jan 26 18:43:09 2008
From: lists at cheimes.de (Christian Heimes)
Date: Sat, 26 Jan 2008 18:43:09 +0100
Subject: [Python-ideas] More on set literals
In-Reply-To: <fnfq0l$r44$1@ger.gmane.org>
References: <fnfq0l$r44$1@ger.gmane.org>
Message-ID: <fnfrfe$to1$2@ger.gmane.org>

VanL wrote:
> {{}} (or set()) # empty mutable set
> {{1,2,3}} # set literal

{{ }} is ambiguous, hard to read and only one to three characters
shorter than set() / set({}). -1 from me

Christian



From guido at python.org  Sat Jan 26 18:53:15 2008
From: guido at python.org (Guido van Rossum)
Date: Sat, 26 Jan 2008 09:53:15 -0800
Subject: [Python-ideas] More on set literals
In-Reply-To: <fnfrfe$to1$2@ger.gmane.org>
References: <fnfq0l$r44$1@ger.gmane.org> <fnfrfe$to1$2@ger.gmane.org>
Message-ID: <ca471dc20801260953x16a67752o875d25948b59be62@mail.gmail.com>

On Jan 26, 2008 9:43 AM, Christian Heimes <lists at cheimes.de> wrote:
> VanL wrote:
> > {{}} (or set()) # empty mutable set
> > {{1,2,3}} # set literal
>
> {{ }} is ambiguous, hard to read and only one to three characters
> shorter than set() / set({}). -1 from me

Right. This is not even on the table.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From veloso at verylowsodium.com  Sat Jan 26 19:37:00 2008
From: veloso at verylowsodium.com (Greg Falcon)
Date: Sat, 26 Jan 2008 13:37:00 -0500
Subject: [Python-ideas] More on set literals
In-Reply-To: <fnfq0l$r44$1@ger.gmane.org>
References: <fnfq0l$r44$1@ger.gmane.org>
Message-ID: <3cdcefb80801261037k4e8ffcd6w68abdc3492d72969@mail.gmail.com>

On 1/26/08, VanL <van.lindberg at gmail.com> wrote:
> I like Raymond's suggestion (apparently adopted) that {1,2,3} is a
> frozenset literal. However, I also like the {{1,2,3}} syntax. So I would
> propose that {{}} become the literal for the *mutable* set, not the
> frozenset.

As Adam Atlas has already pointed out, {{}} already has a meaning in
the newly proposed scheme.

But that's not really the point that needs to be made.  The typical
response to this type of suggestion is to point out the obvious
technical flaw, and that's a shame, because I don't think that's the
real problem.  You could have instead suggested, say, {|1, 2, 3|} as
the new syntax for set literals.  That fixes the technical flaw, but
it still would have been, I believe, a bad idea.

I don't understand the instinct people have that every concept needs
to be generalized, that every non-uniformity in the language needs
fixing.  Normally the target is adding statements to lambdas, but
today our bikeshed is the set literal syntax.  There seems to be an
underlying assumption to these posts that I find irritating -- that
the Python designers and developers would have added these features if
only they were smart enough to be able to think up a syntax for them
without assistance!

Raymond Hettinger's frozenset literal proposal is solid.  It
recognizes that sets created from literal notation will typically not
be mutated, that set({1,2,3}) is not a burdensome syntax in the off
cases, and that literal frozensets admit a new and faster idiom for
comparing a value against a small set of constants.
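That idiom, spelled with today's frozenset() call, is simply a constant
membership test; a frozenset literal would let the compiler build the set
once as a constant, where today it is built once at module level by hand:

```python
# Membership test against a small constant set -- the idiom referred
# to above.  Building the frozenset once, at module level, avoids
# reconstructing it on every call.
VOWELS = frozenset("aeiou")

def count_vowels(text):
    return sum(1 for ch in text.lower() if ch in VOWELS)
```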

The one flaw with his proposal was his prediction:

On 1/24/08, Raymond Hettinger <python at rcn.com> wrote:
> P.S.  A small side-benefit is it may put an end to interminable requests for a {:} or {/} notation for empty sets.  There's not much need for a literal for an empty frozenset (use "not s" instead).

It turned out to only take about two days.

Greg F


From taroso at gmail.com  Mon Jan 28 06:48:32 2008
From: taroso at gmail.com (Taro)
Date: Mon, 28 Jan 2008 16:48:32 +1100
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in types
	with user-defined callables.
Message-ID: <fefcdfb70801272148g78ba3e59gbefbb874097c805e@mail.gmail.com>

Hi all,



Built-in types such as float, string, or list are first-class citizens in
Python sourcefiles, having syntactic support:

    myfloat = 1.0

    mystring = "my string"

    mylist = [1,2,4,8]

    mydict = {1:"a", 2:"b", 3:"c"}

    myset = {1,2,3}



User-defined classes are second-class citizens, requiring data to be
manually converted from a type:

    mydecimal = Decimal("1.00000000000000001")

    myrope = Rope("my rope")

    myblist = BList([1,2,4,8])

    myordereddict = OrderedDict([(1, "a"), (2, "b"), (3, "c")])

    myfrozenset = frozenset([1,2,3])



If there's only one or two conversions needed in a file, then such
conversion is not particularly burdensome, but if one wants to consistently
use (say) decimals throughout a file then the ability to use a literal
syntax makes for prettier source.



Some languages have open types, allowing the addition of methods to built-in
types.  This is not considered desired behaviour for Python, since
modifications made in one module can potentially affect code in other
modules.



A typedef is syntactic sugar to allow user-defined replacements to be
treated as first-class citizens in code.  They affect only the module in
which they appear and do not modify the original type.  To be typedeffable,
something must be a builtin type, have a constant/syntactic representation,
and be callable.  Hence, typedeffable types would be limited to complex,
dict, float, int, ?object?, list, slice, set, string, and tuple.   No
modification is made to __builtins__ or types, so conversion and/or
reference to the original type is still possible.



The syntax for a typedef is:

    from MODULE typedef ADAPTOR as TYPE

OR

    typedef BUILTINADAPTOR as TYPE



Syntactic constants of a given type are then wrapped with a call:

    ADAPTOR(SOURCELITERAL)

where SOURCELITERAL is the string that appears in the sourcecode

eg:

    from decimal typedef Decimal as float

    i = 1.000000000000000000000000000001

translates as:

    from decimal import Decimal as float

    i = Decimal("1.000000000000000000000000000001")



Syntactic collections of a given type are always provided with a list of
objects, eg:

    from decimal typedef Decimal as float

    from blist typedef BList as list

    i = 1.000000000000000000000000000001

    b = [1.1, 4.2]

translates as:

    from decimal import Decimal as float

    from blist import BList as list

    b = BList([Decimal("1.1"), Decimal("4.2")])



and

    from collections typedef OrderedDict as dict

    d = {1:"a", 2:"b", 3:"c"}

as:

    from collections import OrderedDict as dict

    d = OrderedDict([(1,"a"), (2,"b"), (3,"c")])


A typedef appears at the start of a module immediately after any __future__
imports.  As no adaptors can be defined in a module before a typedef and
typedefs are in no way a forward declaration, "typedef ADAPTOR as TYPE" only
works for builtins, since to do otherwise would lead to one of two
unpalatable options; either:

a/ definition of adaptors would have to be allowed pre-typedef, which would
allow them to be buried in code, making them far easier to miss; or

b/ adaptors would be defined after the typedef, which means that you'd have
to handle:

    typedef float as float
    def float():
        pass

or:

    typedef MyObject as object
    class MyObject(object):
        pass

or:

    typedef myfloat as float
    x = 1.1
    def myfloat():
        pass



It is true that if a valid typedef is made, the type can be redefined within
the module; but the "consenting adults" rule applies -- it's possible to
redefine str and float multiple times within a module as well, but that
isn't recommended either (and the typedef at the top of the module at least
indicates that non-standard behaviour is to be expected)



It is a SyntaxError to typedef the same type more than once:

    from decimal typedef Decimal as float

    from types typedef FloatType as float   #SyntaxError("Type 'float'
already redefined.")



Spelling:  "typedef" is prettier than "pragma", and less likely to be in use
than "use" but its behaviour is somewhat different from C's typedef, so
perhaps another term may be preferred.



Theoretical Performance Differences:  Since a typedef is purely syntactic 
sugar, and all transformational work would be done at compilation, running 
code should be no slower (there should be no new opcodes necessary) and by 
default no faster than performing manual conversion (though they may assist 
an optimisation).  Unless optimisation magic is available, 
performance-critical code should be careful when typedeffing not to use 
typedeffed literals in inner loops.



I know it's considered polite to provide code, but any implementation would
have to be in C, so please accept this extremely fake dummy implementation
which in no way resembles the way things really work as a poor substitute:



typedefs = { FLOATTYPE: None,
             INTTYPE: None,
             LISTTYPE: None,
             DICTTYPE: None,
             ... }

while True:
    line = lines.next()
    type, module, adaptor = parsetypedef(line)
    if type is None:
        break
    if typedefs[type] is not None:
        raise SyntaxError("Typedef redef")
    typedefs[type] = adaptor
    if module is not None:
        emit_bytecode_for_import_from(type, module, adaptor)
    else:
        emit_bytecode_for_assignment(type, adaptor)

parse([line] + lines)
...

def emit_float(...):
    if typedefs[FLOATTYPE] is not None:
        emit_constant(typedefs[FLOATTYPE][0], stringliteral)
    else:
        ... # standard behaviour

def emit_list(...):
    if typedefs[LISTTYPE] is not None:
        emit(LOADGLOBAL, typedefs[LISTTYPE])
    # standard behaviour
    if typedefs[LISTTYPE] is not None:
        emit(CALL_FUNCTION)
    # standard behaviour



All rights are assigned to the Python Software Foundation
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20080128/4fec1db9/attachment.html>

From taroso at gmail.com  Mon Jan 28 07:01:19 2008
From: taroso at gmail.com (Taro)
Date: Mon, 28 Jan 2008 17:01:19 +1100
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in types
	with user-defined callables (Take 2... fewer newlines!)
Message-ID: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>

Hi all... again,

Built-in types such as float, string, or list are first-class citizens
in Python sourcefiles, having syntactic support:
    myfloat = 1.0
    mystring = "my string"
    mylist = [1,2,4,8]
    mydict = {1:"a", 2:"b", 3:"c"}
    myset = {1,2,3}

User-defined classes are second-class citizens, requiring data to be
manually converted from a type:
    mydecimal = Decimal("1.00000000000000001")
    myrope = Rope("my rope")
    myblist = BList([1,2,4,8])
    myordereddict = OrderedDict([(1, "a"), (2, "b"), (3, "c")])
    myfrozenset = frozenset([1,2,3])

If there's only one or two conversions needed in a file, then such
conversion is not particularly burdensome, but if one wants to
consistently use (say) decimals throughout a file then the ability to
use a literal syntax makes for prettier source.

Some languages have open types, allowing the addition of methods to
built-in types.  This is not considered desired behaviour for Python,
since modifications made in one module can potentially affect code in
other modules.

A typedef is syntactic sugar to allow user-defined replacements to be
treated as first-class citizens in code.  They affect only the module
in which they appear and do not modify the original type.  To be
typedeffable, something must be a builtin type, have a
constant/syntactic representation, and be callable.  Hence,
typedeffable types would be limited to complex, dict, float, int,
?object?, list, slice, set, string, and tuple.   No modification is
made to __builtins__ or types, so conversion and/or reference to the
original type is still possible.

The syntax for a typedef is:
    from MODULE typedef ADAPTOR as TYPE
OR
    typedef BUILTINADAPTOR as TYPE

Syntactic constants of a given type are then wrapped with a call:
    ADAPTOR(SOURCELITERAL)
where SOURCELITERAL is the string that appears in the sourcecode

eg:
    from decimal typedef Decimal as float
    i = 1.000000000000000000000000000001

translates as:
    from decimal import Decimal as float
    i = Decimal("1.000000000000000000000000000001")

Syntactic collections of a given type are always provided with a list
of objects, eg:
    from decimal typedef Decimal as float
    from blist typedef BList as list
    b = [1.1, 4.2]

translates as:
    from decimal import Decimal as float
    from blist import BList as list
    b = BList([Decimal("1.1"), Decimal("4.2")])

and

    from collections typedef OrderedDict as dict
    d = {1:"a", 2:"b", 3:"c"}

as:

    from collections import OrderedDict as dict
    d = OrderedDict([(1,"a"), (2,"b"), (3,"c")])
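Since a typedef is defined as pure sugar over import-and-convert, the
hand-written expansion runs today (assuming a Python with
collections.OrderedDict available):

```python
# Hand-written equivalent of the two typedefs above -- rebinding the
# builtin names explicitly and wrapping each literal in a call.  This
# is exactly the expansion the proposal would perform implicitly.
from decimal import Decimal as float
from collections import OrderedDict as dict

i = float("1.000000000000000000000000000001")   # a Decimal, not a binary float
d = dict([(1, "a"), (2, "b"), (3, "c")])        # an OrderedDict
```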

A typedef appears at the start of a module immediately after any
__future__ imports.  As no adaptors can be defined in a module before
a typedef and typedefs are in no way a forward declaration, "typedef
ADAPTOR as TYPE" only works for builtins, since to do otherwise would
lead to one of two unpalatable options; either:

a/ definition of adaptors would have to be allowed pre-typedef, which
would allow them to be buried in code, making them far easier to miss;
or
b/ adaptors would be defined after the typedef, which means that you'd
have to handle:
    typedef float as float
    def float():
        pass

or:
    typedef MyObject as object
    class MyObject(object):
        pass

or:
    typedef myfloat as float
    x = 1.1
    def myfloat():
        pass

It is true that if a valid typedef is made, the type can be redefined
within the module; but the "consenting adults" rule applies -- it's
possible to redefine str and float multiple times within a module as
well, but that isn't recommended either (and the typedef at the top of
the module at least indicates that non-standard behaviour is to be
expected)

It is a SyntaxError to typedef the same type more than once:
    from decimal typedef Decimal as float
    from types typedef FloatType as float   #SyntaxError("Type 'float'
already redefined.")

Spelling:  "typedef" is prettier than "pragma", and less likely to be
in use than "use" but its behaviour is somewhat different from C's
typedef, so perhaps another term may be preferred.

Theoretical Performance Differences:  Since a typedef is purely
syntactic sugar, and all transformational work would be done at
compilation, running code should be no slower (there should be no new
opcodes necessary) and by default no faster than performing manual
conversion (though they may assist an optimisation).  Unless
optimisation magic is available, performance-critical code should be
careful when typedeffing not to use typedeffed literals in inner
loops.

I know it's considered polite to provide code, but any implementation
would have to be in C, so please accept this extremely fake dummy
implementation which in no way resembles the way things really work as
a poor substitute:

typedefs = { FLOATTYPE: None,
             INTTYPE: None,
             LISTTYPE: None,
             DICTTYPE: None,
             ... }

while True:
    line = lines.next()
    type, module, adaptor = parsetypedef(line)
    if type is None:
        break
    if typedefs[type] is not None:
        raise SyntaxError("Typedef redef")
    typedefs[type] = adaptor

    if module is not None:
        emit_bytecode_for_import_from(type, module, adaptor)
    else:
        emit_bytecode_for_assignment(type, adaptor)

parse([line] + lines)
...

def emit_float(...):
    if typedefs[FLOATTYPE] is not None:
        emit_constant(typedefs[FLOATTYPE][0], stringliteral)
    else:
        ... # standard behaviour

def emit_list(...):
    if typedefs[LISTTYPE] is not None:
        emit(LOADGLOBAL, typedefs[LISTTYPE])
    # standard behaviour

    if typedefs[LISTTYPE] is not None:
        emit(CALL_FUNCTION)
    # standard behaviour

All rights are assigned to the Python Software Foundation


From matt-python at theory.org  Mon Jan 28 11:01:28 2008
From: matt-python at theory.org (Matt Chisholm)
Date: Mon, 28 Jan 2008 02:01:28 -0800
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in
	types with user-defined callables.
In-Reply-To: <mailman.5195.1201500085.894.python-ideas@python.org>
References: <mailman.5195.1201500085.894.python-ideas@python.org>
Message-ID: <20080128100128.GD6391@tesla.theory.org>

I don't think I followed all of what you wrote.  However, it would
certainly be convenient shorthand to be able to say that, for an
entire file, program, or module, the float literal syntax created
Decimals instead.

And I can imagine cases where an entire app or module was always using
ordered dicts instead of dicts, or frozensets instead of sets.

-matt



From talin at acm.org  Tue Jan 29 04:44:23 2008
From: talin at acm.org (Talin)
Date: Mon, 28 Jan 2008 19:44:23 -0800
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in
 types	with user-defined callables (Take 2... fewer newlines!)
In-Reply-To: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>
References: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>
Message-ID: <479EA117.5050602@acm.org>

Taro wrote:
> Hi all... again,
> 
> Built-in types such as float, string, or list are first-class citizens
> in Python sourcefiles, having syntactic support:
>     myfloat = 1.0
>     mystring = "my string"
>     mylist = [1,2,4,8]
>     mydict = {1:"a", 2:"b", 3:"c"}
>     myset = {1,2,3}
> 
> User-defined classes are second-class citizens, requiring data to be
> manually converted from a type:
>     mydecimal = Decimal("1.00000000000000001")
>     myrope = Rope("my rope")
>     myblist = BList([1,2,4,8])
>     myordereddict = OrderedDict((1,"a"), (2, "b"), (3, "c"))
>     myfrozenset = frozenset([1,2,3])

The problem I see with this is, what if you need to use both decimals 
and floats together?

I've often thought that there should be a shorter spelling for decimal 
numbers, but I was thinking that a simple suffix letter would suffice:

     mydecimal = 1.0000000000000001d

(This assumes of course that the compiler knows how to form a decimal 
constant, although the actual construction of the constant could be 
deferred until runtime.)
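Whatever the spelling, the construction does have to start from the
source text; routing it through a binary float constant first defeats the
purpose (this sketch assumes Python 3.2+, where Decimal accepts a float):

```python
from decimal import Decimal

# Why a Decimal literal must be built from the source text: the nearest
# binary double to 1.0000000000000001 is exactly 1.0, so going through
# a float constant silently discards the final digit.
exact = Decimal("1.0000000000000001")   # from the source string
lossy = Decimal(1.0000000000000001)     # via a float constant
```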

And if you really are suffering repetitive strain injury from having to 
type 'OrderedDict' 100 times in your code, it seems to me that you could 
just avoid creating each one individually, and instead have an array of 
inputs which gets converted to an array of OrderedDict. In other words, 
one generally doesn't see code where the same repeated item is assigned 
to 100 individual variables; Most programmers, when they see more than 
half a dozen similar items, will start thinking about ways in which they 
can roll up the definitions into an algorithm to generate them.

-- Talin


From jimjjewett at gmail.com  Tue Jan 29 14:34:32 2008
From: jimjjewett at gmail.com (Jim Jewett)
Date: Tue, 29 Jan 2008 08:34:32 -0500
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in
	types with user-defined callables (Take 2... fewer newlines!)
In-Reply-To: <479EA117.5050602@acm.org>
References: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>
	<479EA117.5050602@acm.org>
Message-ID: <fb6fbf560801290534q6bde6943vf837fef860311903@mail.gmail.com>

On 1/28/08, Talin <talin at acm.org> wrote:
> Taro wrote:
> > Built-in types such as float, string, or list are first-class citizens ...

> >     myfloat = 1.0

> > User-defined classes are second-class citizens, requiring data to be
> > manually converted from a type:
> >     mydecimal = Decimal("1.00000000000000001")

> The problem I see with this is, what if you need to use both decimals
> and floats together?

Then don't overwrite the float type.  :D

> I've often thought that there should be a shorter spelling for decimal
> numbers, but I was thinking that a simple suffix letter would suffice:

>      mydecimal = 1.0000000000000001d

That solves a slightly different problem.  It expands the set of known
types to include Decimal, but it doesn't let you say:

    For this run, make all strings be instances of MyString,
    which is a string subclass with extra logging to help me debug.

> And if you really are suffering repetitive strain injury from having to
> type 'OrderedDict' 100 times in your code, it seems to me that you could
> just avoid creating each one individually, and instead have an array of
> inputs which gets converted to an array of OrderedDict.

When I have wanted this, I didn't have an array; I had existing code
with string literals sprinkled throughout.  I wanted to minimize
(diff-visible) changes because I knew I would back them out once the
debugging or testing were done.

-jJ


From stephen at xemacs.org  Tue Jan 29 22:59:53 2008
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Wed, 30 Jan 2008 06:59:53 +0900
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in
	types	with user-defined callables (Take 2... fewer newlines!)
In-Reply-To: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>
References: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>
Message-ID: <873asgcmxy.fsf@uwakimon.sk.tsukuba.ac.jp>

Taro writes:

 > User-defined classes are second-class citizens, requiring data to be
 > manually converted from a type:
 >     mydecimal = Decimal("1.00000000000000001")
 >     myrope = Rope("my rope")
 >     myblist = BList([1,2,4,8])
 >     myordereddict = OrderedDict((1,"a"), (2, "b"), (3, "c"))
 >     myfrozenset = frozenset([1,2,3])

As a matter of presenting the PEP in the best light, this isn't an
issue of "second-class" to me, for one.

ISTM that it's simply a matter of (a) giving syntax to constructions
that are used a lot in control structures, and (b) the historical fact
that other types that have syntax defined for them came first and were
added as built-ins.  Some of them possibly shouldn't have syntax.  As
a strawman example, many Python programs use floats very rarely, and
we could conceivably -- but not *plausibly* -- demote float and
require "my_approximate_pi = float('3.14')".

 > If there's only one or two conversions needed in a file, then such
 > conversion is not particularly burdensome, but if one wants to
 > consistently use (say) decimals throughout a file then the ability to
 > use a literal syntax makes for prettier source.

This is very persuasive, but may not be enough.  Jim's suggestion of a
"logging string" was also interesting.

 > The syntax for a typedef is:
 >     from MODULE typedef ADAPTOR as TYPE

I'm sorry, but I hate this term "typedef".  This is not a "type
definition" in any Pythonic sense of the word "type", and the
associations with C-style typedefs are painful.  How about

    from MODULE import STRINGCONVERTER with TYPE readsyntax
or
    from MODULE import STRINGCONVERTER with readsyntax TYPE
or
    from MODULE import STRINGCONVERTER readsyntax TYPE
?

 > a/ definition of adaptors would have to be allowed pre-typedef, which
 > would allow them to be buried in code, making them far easier to miss;

I don't see why this is a problem.  An adaptor for a scalar TYPE is
just a converter from string to that TYPE.  If you have a such a
converter (eg, because you're using Python as the platform for
translating another language), why not just use it here?

Since the way this would work is that each TYPE would have a
string-to-TYPE-value converter, you would just do (inside the
compiler)

TYPE.stringconverter = STRINGCONVERTER

and if the compiler encountered a literal with an undefined
.stringconverter, it would generate an error to the effect of

    Use of TYPE stringconverter 'STRINGCONVERTER' before definition.

I think this would not be a problem in practice (except for typos)
because you'd write decimal.py like this:

### decimal.py --- class Decimal for multiprecision decimal arithmetic
from decimal import string_to_Decimal with float readsyntax

class Decimal:
    ...

def string_to_Decimal (astring):
    ...

# Decimal constants
pi = 3.1415926...
### decimal.py ends here
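A runnable toy model of that registry idea (all names here -- the
stringconverters table, emit_literal -- are invented for illustration,
not part of any proposal):

```python
from decimal import Decimal

# Toy model of the per-type converter registry sketched above: the
# "compiler" looks up a string-to-value converter for each literal and
# errors out if a literal is used before its converter is defined.
stringconverters = {}

def emit_literal(typename, source_text):
    converter = stringconverters.get(typename)
    if converter is None:
        raise SyntaxError("Use of %s stringconverter before definition"
                          % typename)
    return converter(source_text)

stringconverters["float"] = Decimal
pi_ish = emit_literal("float", "3.1415926")
```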

Other comments:

One problem I see with this proposal is that for non-scalar TYPEs, in
general a user may want to take over the whole process of parsing TYPE
literals.  What I don't see in your proposal is what restrictions you
want to put on that.  It seems like what you have in mind for a MyList
is to convert from a list, but what if in your code you mostly want
floats to be floats, but in MyLists they should be Decimals?  This
would mean loss of precision:

    string -> list of floats -> MyList of Decimals

Don't take the example too seriously, we can discuss later whether
there would be real use cases like this.  What I want to know is how
you see this case working.  Take over parsing the whole string?  Or
let the list type parse the string, and then convert?  In the latter
case, I think EIBTI applies.  The former seems rather unlikely to fly
as a PEP.

 > I know it's considered polite to provide code, but any implementation
 > would have to be in C,

No, there's PyPy.

 > so please accept this extremely fake dummy
 > implementation which in no way resembles the way things really work as
 > a poor substitute:

You might as well omit this; it doesn't really even help answer my
question above.

(IANALL but ISTM that) an issue here is at what point does the
compiler learn that a syntactic list is actually a literal, and your
code doesn't help indicate that, or whether it would need to differ
from the current compiler, either.


From tjreedy at udel.edu  Wed Jan 30 04:01:57 2008
From: tjreedy at udel.edu (Terry Reedy)
Date: Tue, 29 Jan 2008 22:01:57 -0500
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in
	types with user-defined callables (Take 2... fewer newlines!)
References: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>
Message-ID: <fnopb5$ie8$1@ger.gmane.org>


"Taro" <taroso at gmail.com> wrote in message 
news:fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2 at mail.gmail.com...

| Theoretical Performance Differences:  Since a typedef is purely
| syntactic sugar, and all tranformational work would be done at
| compilation,

I think this is a key issue which, unfortunately, I believe, works against 
your proposal.  If the alternative constructor is a builtin C function, 
then yes, the transformation could be done at compile time.  But I think 
this is much more difficult for one written in Python.  As far as I know, the 
byte-code interpreter is not normally running during compilation.  From 
__future__ imports work precisely because they are not really imports.  But 
alternate constructor imports would have to be.  I don't know enough, 
though, to know if this could be made to work.

tjr







From taroso at gmail.com  Thu Jan 31 10:29:01 2008
From: taroso at gmail.com (Taro)
Date: Thu, 31 Jan 2008 20:29:01 +1100
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in
	types with user-defined callables (Take 2... fewer newlines!)
In-Reply-To: <479EA117.5050602@acm.org>
References: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>
	<479EA117.5050602@acm.org>
Message-ID: <fefcdfb70801310129y7acc7deo9adbf66434d4a876@mail.gmail.com>

Talin, hi

On Jan 29, 2008 2:44 PM, Talin <talin at acm.org> wrote:
> The problem I see with this is, what if you need to use both decimals
> and floats together?
If you need decimals and floats together, then you'd need to pick one
and convert the other, eg:
    from decimal typedef Decimal as float
    ofloat = __builtins__.float
    mydecimals = [1.00000000000000000000001, ..., 99.000000000000000009]
    myfloat = ofloat("3.14")

If for some reason you need exactly even quantities of decimals and
floats then at least you're no worse off than you are now -- just
don't typedef.

> And if you really are suffering repetitive strain injury from having to
> type 'OrderedDict' 100 times in your code, it seems to me that you could
> just avoid creating each one individually, and instead have an array of
> inputs which gets converted to an array of OrderedDict. In other words,
> one generally doesn't see code where the same repeated item is assigned
> to 100 individual variables; Most programmers, when they see more than
> half a dozen similar items, will start thinking about ways in which they
> can roll up the definitions into an algorithm to generate them.
But it gets back to the prettiness* factor - it's prettier to read a
mapping of keys to values that's written like a dict than it is to
read a list of 2-tuples, and at least some of the time it's much nicer
to have such a mapping in the sourcefile than converting a datafile.

-T.
(*eye of the beholder blah blah blah ;-)


From taroso at gmail.com  Thu Jan 31 14:38:32 2008
From: taroso at gmail.com (Taro)
Date: Fri, 1 Feb 2008 00:38:32 +1100
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in
	types with user-defined callables (Take 2... fewer newlines!)
In-Reply-To: <873asgcmxy.fsf@uwakimon.sk.tsukuba.ac.jp>
References: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>
	<873asgcmxy.fsf@uwakimon.sk.tsukuba.ac.jp>
Message-ID: <fefcdfb70801310538u7cb94d51w1990442bccecd3bd@mail.gmail.com>

Stephen, hi, and thanks for the critique.

On Jan 30, 2008 8:59 AM, Stephen J. Turnbull <stephen at xemacs.org> wrote:
> This is very persuasive, but may not be enough.  Jim's suggestion of a
> "logging string" was also interesting.
Nod -- changing one line at the beginning of a file could be rather elegant.

> I'm sorry, but I hate this term "typedef".  This is not a "type
> definition" in any Pythonic sense of the word "type", and the
> associations with C-style typedefs are painful.  How about
I'm not particularly enamoured of "typedef" either, but the best
alternative I could think of was (ugh) "replacetype". If the idea is
deemed workable, the exact spelling can come later - I'll use
"replacingsyntax" until a better alternative comes along. Come to
think of it, not using "as" as the preposition is probably a good
idea, as it allows for less magic:
    # literal replacement only
    # float() calls __builtins__.float(); Decimal() calls decimal.Decimal()
    from decimal import Decimal replacingsyntax float
and
    # type replacement
    # float() calls decimal.Decimal(); Decimal() raises a NameError
    from decimal import Decimal as float replacingsyntax float
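For comparison, plain name rebinding already covers the call half of
this today - the proposal's new ground is the literals. A minimal
sketch in current Python:

```python
# Rebinding the name covers calls but not literals: "float(...)" now
# builds a Decimal, while the literal 1.01 is still a builtin float.
from decimal import Decimal as float

x = float("1.01")   # calls decimal.Decimal
y = 1.01            # literal syntax is untouched by name rebinding

print(type(x).__name__, type(y).__name__)   # -> Decimal float
```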

>  > a/ definition of adaptors would have to be allowed pre-typedef, which
>  > would allow them to be buried in code, making them far easier to miss;
> I don't see why this is a problem.  An adaptor for a scalar TYPE is
> just a converter from string to that TYPE.  If you have such a
> converter (eg, because you're using Python as the platform for
> translating another language), why not just use it here?
>
> Since the way this would work is that each TYPE would have a
> string-to-TYPE-value converter, you would just do (inside the
> compiler)
>
> TYPE.stringconverter = STRINGCONVERTER
>
> and if the compiler encountered a literal with an undefined
> .stringconverter, it would generate an error to the effect of
>
>     Use of TYPE stringconverter 'STRINGCONVERTER' before definition.
[...provided that TYPE.stringconverter is set]

If I understand your intent correctly, as things stand this would
ordinarily be a runtime NameError, and extra code would need to be
added to the compiler to keep track of that:
   from decimal import Decimal with replacingsyntax float as float
   x = 1.01                  # SyntaxError here???
   def Decimal():....


> I think this would not be a problem in practice (except for typos)
> because you'd write decimal.py like this:
> ### decimal.py --- class Decimal for multiprecision decimal arithmetic
> from decimal import string_to_decimal with float readsyntax
I forgot that a module can import itself (I can't recall ever using
that for real, though I suppose mutual imports count), but that makes
me uncomfortable; I'm not sure my discomfort is well-founded, though.

> want to put on that.  It seems like what you have in mind for a MyList
> is to convert from a list, but what if in your code you mostly want
> floats to be floats, but in MyLists they should be Decimals?  This
> would mean loss of precision:
For this, the "solution" to keep precision would be along the lines of:
    ## mycollections.py
    from decimal import Decimal
    class MyListFloatHelper(float):
        def __init__(self, value):
            self._strvalue = value

    class MyList(list):
        def __init__(self, inlist):
            for elem in inlist:
                try:
                    elem = Decimal(elem._strvalue)
                except AttributeError:
                    pass
                self.append(elem)

    ## main.py
    from mycollections import MyListFloatHelper with readsyntax float
    from mycollections import MyList with readsyntax list
    def foo():
        myfloat = 1.000000000000000000000000000001
        mydlist = [1.0000000000000000000000000001]
        mydecimal = mydlist[0]
        print myfloat,"/", mydecimal
        #==> 1.0 / Decimal("1.0000000000000000000000000001")
        print type(myfloat), "/", type(mydecimal)
        #==> <class 'mycollections.MyListFloatHelper'> / <class 'decimal.Decimal'>
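Stripped of the proposed import syntax, the helper-class half of this
can already be exercised in plain Python; a sketch (Python 3 syntax,
with the Decimal import that mycollections.py would need):

```python
from decimal import Decimal

class MyListFloatHelper(float):
    """A float that also remembers the source text of its literal."""
    def __init__(self, value):
        self._strvalue = value

class MyList(list):
    """A list that upgrades helper floats to full-precision Decimals."""
    def __init__(self, inlist):
        super().__init__()
        for elem in inlist:
            try:
                elem = Decimal(elem._strvalue)
            except AttributeError:
                pass  # not a helper; keep the element as-is
            self.append(elem)

myfloat = MyListFloatHelper("1.000000000000000000000000000001")
mydlist = MyList([MyListFloatHelper("1.0000000000000000000000000001")])
print(float(myfloat), "/", mydlist[0])
# -> 1.0 / 1.0000000000000000000000000001
```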

> (IANALL but ISTM that) an issue here is at what point does the
> compiler learn that a syntactic list is actually a literal, and your
> code doesn't help indicate that, or whether it would need to differ
> from the current compiler, either.

It would learn that a token is a literal at the same time it does now.
The determination that a literal needs replacing has to be made at the
compilation stage, so that the literal's string form can be preserved
and the extra opcodes emitted (an AST visitor would presumably be too
late, since the original literals have already been converted).
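That suspicion about AST visitors is easy to check in current CPython:
by the time an AST exists, the literal is already a (rounded) float
object, so a tree transformer never sees the original digits:

```python
import ast

tree = ast.parse("x = 1.0000000000000000000000000001")
node = tree.body[0].value   # the Constant node for the literal
# The 29-digit literal has already been rounded to the nearest double:
print(type(node.value).__name__, node.value)   # -> float 1.0
```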

    def foo():
        myfloat = 1.000000000000000000000000000001
        mydecimallist = [1.0000000000000000000000000001]
        mydecimal = mydecimallist[0]

    dis.dis(foo)        # Currently
    #==>  2           0 LOAD_CONST               1 (1.0)
    #==>              3 STORE_FAST               0 (myfloat)
    #==>
    #==>  3           6 LOAD_CONST               1 (1.0)
    #==>              9 BUILD_LIST               1
    #==>             12 STORE_FAST               1 (mydecimallist)
    #==>
    #==>  4          15 LOAD_FAST                1 (mydecimallist)
    #==>             18 LOAD_CONST               2 (0)
    #==>             21 BINARY_SUBSCR
    #==>             22 STORE_FAST               2 (mydecimal)
    #==>             25 LOAD_CONST               0 (None)
    #==>             28 RETURN_VALUE


    dis.dis(foo)         # With MyListFloatHelper and MyList
    #==>   2           0 LOAD_GLOBAL              0 (MyListFloatHelper)
    #==>               3 LOAD_CONST               1 ('1.0000000000000000000000000001')
    #==>               6 CALL_FUNCTION            1
    #==>               9 STORE_FAST               0 (myfloat)
    #==>
    #==>   3          12 LOAD_GLOBAL              1 (MyList)
    #==>              15 LOAD_GLOBAL              0 (MyListFloatHelper)
    #==>              18 LOAD_CONST               1 ('1.0000000000000000000000000001')
    #==>              21 CALL_FUNCTION            1
    #==>              24 BUILD_LIST               1
    #==>              27 CALL_FUNCTION            1
    #==>              30 STORE_FAST               1 (mydecimallist)
    #==>
    #==>   4          33 LOAD_FAST                1 (mydecimallist)
    #==>              36 LOAD_CONST               2 (0)
    #==>              39 BINARY_SUBSCR
    #==>              40 STORE_FAST               2 (mydecimal)
    #==>              43 LOAD_CONST               0 (None)
    #==>              46 RETURN_VALUE
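The first listing (current behaviour) is easy to confirm: both long
literals round to the same float and end up sharing one constant slot,
so the original digits are gone before any bytecode runs. A quick
check of the constants table:

```python
def foo():
    myfloat = 1.000000000000000000000000000001
    mydecimallist = [1.0000000000000000000000000001]
    mydecimal = mydecimallist[0]

# Both literals have collapsed to the single float constant 1.0:
print(foo.__code__.co_consts)
```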


From stargaming at gmail.com  Thu Jan 31 21:17:04 2008
From: stargaming at gmail.com (Stargaming)
Date: Thu, 31 Jan 2008 20:17:04 +0000 (UTC)
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in
	types	with user-defined callables (Take 2... fewer newlines!)
References: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>
Message-ID: <fntac0$7qm$1@ger.gmane.org>

On Mon, 28 Jan 2008 17:01:19 +1100, Taro wrote:
[typedef PEP snipped]

If I understood your proposal correctly, it's just about dynamically 
*replacing* the built-in types (with their respective syntax) with some 
type of your own. 

I doubt this will ever be accepted (quite apart from the technical 
difficulties), since *changing* built-in types *in-place* has been 
disallowed for exactly this reason: it could cause weird behaviour 
where you expected the **agreed-on, built-in** behaviour, without 
there being an obvious cause.

Cheers,




From python at rcn.com  Thu Jan 31 21:36:26 2008
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 31 Jan 2008 15:36:26 -0500 (EST)
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of
 built-in	types	with user-defined callables (Take 2... fewer newlines!)
Message-ID: <20080131153626.AGB97247@ms19.lnh.mail.rcn.net>

> If I understood your proposal correctly, it's just about
> dynamically *replacing* the built-in types (with their 
> respective syntax) with some type of your own. 

FWIW, there is an interesting type replacement recipe on
the bottom of the page at:

     http://docs.python.org/lib/module-tokenize.html
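The recipe there rewrites source at the token level; a condensed
sketch of the core idea, after the decistmt example in those docs
(in modern syntax):

```python
import io
import tokenize

def decistmt(s):
    """Rewrite float literals in source string s as Decimal('...')
    calls, working on the token stream so the original digits survive
    (after the decistmt example in the tokenize docs)."""
    result = []
    for tok in tokenize.generate_tokens(io.StringIO(s).readline):
        if tok.type == tokenize.NUMBER and '.' in tok.string:
            # Swap the NUMBER token for Decimal('<original digits>')
            result.extend([
                (tokenize.NAME, 'Decimal'),
                (tokenize.OP, '('),
                (tokenize.STRING, repr(tok.string)),
                (tokenize.OP, ')'),
            ])
        else:
            result.append((tok.type, tok.string))
    return tokenize.untokenize(result)

print(decistmt("x = 1.0000000000000000000000000001"))
```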


Raymond


From stephen at xemacs.org  Thu Jan 31 22:18:28 2008
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Fri, 01 Feb 2008 06:18:28 +0900
Subject: [Python-ideas] non-Pre-PEP: Syntactic replacement of built-in
	types with user-defined callables (Take 2... fewer newlines!)
In-Reply-To: <fefcdfb70801310538u7cb94d51w1990442bccecd3bd@mail.gmail.com>
References: <fefcdfb70801272201y10d57ebdo39d559ca5bc6e8a2@mail.gmail.com>
	<873asgcmxy.fsf@uwakimon.sk.tsukuba.ac.jp>
	<fefcdfb70801310538u7cb94d51w1990442bccecd3bd@mail.gmail.com>
Message-ID: <87ve59ae3f.fsf@uwakimon.sk.tsukuba.ac.jp>

Taro writes:

 > > Since the way this would work is that each TYPE would have a
 > > string-to-TYPE-value converter, you would just do (inside the
 > > compiler)
 > >
 > > TYPE.stringconverter = STRINGCONVERTER
 > >
 > > and if the compiler encountered a literal with an undefined
 > > .stringconverter, it would generate an error to the effect of
 > >
 > >     Use of TYPE stringconverter 'STRINGCONVERTER' before definition.
 > [...provided that TYPE.stringconverter is set]
 > 
 > If I understand your intent correctly, as things stand this would
 > ordinarily be a runtime NameError,

That is exactly my intent.  A more explicit error message than
"NameError: name 'myTypeLexer' not defined" would be nice,
although if you give the function a reasonably explicit name it should
be clear enough.

However, a real problem with this idea is that I don't know if the
compiler knows how to call the functions it has just compiled!

Also, as syntax in general this kind of order dependence would be
considered unPythonic, I suspect.

 > and extra code would need to be
 > added to the compiler to keep track of that:
 >    from decimal import Decimal with replacingsyntax float as float
 >    x = 1.01                  # SyntaxError here???
 >    def Decimal():....

No, it would be a NameError: the name of the implicitly called
converter is not defined in scope.  The syntax is correct.
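That is, the failure mode is the ordinary late-binding one Python
already has: the converter's name would be looked up globally at call
time. A sketch, using a hypothetical my_converter name:

```python
def uses_converter():
    # my_converter is looked up globally at call time, not compile time
    return my_converter("1.01")

try:
    uses_converter()            # converter not yet defined
except NameError as e:
    print("as expected:", e)

def my_converter(s):
    from decimal import Decimal
    return Decimal(s)

print(uses_converter())         # now succeeds -> 1.01
```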

 > For this, the "solution" to keep precision would be along the lines of:

 >     ## mycollections.py
 >     class MyListFloatHelper(float):
 >         def __init__(self, value):
 >             self._strvalue = value
 >
 >     class MyList(list):
 >         def __init__(self, inlist):
 >             for elem in inlist:
 >                 try:
 >                     elem = Decimal(elem._strvalue)
 >                 except AttributeError:
 >                     pass
 >                 self.append(elem)
 > 
 >     ## main.py
 >     from mycollections import MyListFloatHelper with readsyntax float
 >     from mycollections import MyList with readsyntax list
 >     def foo():
 >         myfloat = 1.000000000000000000000000000001
 >         mydlist = [1.0000000000000000000000000001]
 >         mydecimal = mydlist[0]
 >         print myfloat,"/", mydecimal
 >         #==> 1.0 / Decimal("1.0000000000000000000000000001")
 >         print type(myfloat), "/", type(mydecimal)
 >         #==> <class 'mycollections.MyListFloatHelper'> / <class 'decimal.Decimal'>

You've missed the point.  I want myfloat to be a Python builtin float
for some reason.  Only in the context of a dlist do I want that
read syntax to be parsed as a Decimal.

You can just require that literal read-syntax replacements be global,
I guess, but the proposal needs to say something about whether
per-context replacement is going to be possible.