[Python-ideas] proto-PEP: Fixing Non-constant Default Arguments

Chris Rebert cvrebert at gmail.com
Sun Jan 28 20:22:44 CET 2007

The following is a proto-PEP based on the discussion in the thread
"fixing mutable default argument values". Comments would be greatly
appreciated.

- Chris Rebert

Title: Fixing Non-constant Default Arguments


Abstract

     This PEP proposes new semantics for default arguments to remove
boilerplate code associated with non-constant default argument values,
allowing them to be expressed more clearly and succinctly.


Motivation

     Currently, to write functions using non-constant default arguments,
one must use the idiom:

     def foo(non_const=None):
         if non_const is None:
             non_const = some_expr
         #rest of function

or equivalent code. Naive programmers desiring mutable default arguments 
often make the mistake of writing the following:

     def foo(mutable=some_expr_producing_mutable):
         #rest of function

However, this does not work as intended, as 
'some_expr_producing_mutable' is evaluated only *once* at 
definition-time, rather than once per call at call-time.  This results 
in all calls to 'foo' using the same default value, which can result in 
unintended consequences.  This necessitates the previously mentioned 
idiom. This unintuitive behavior is such a frequent stumbling block for 
newbies that it is present in at least 3 lists of Python's problems [0] 
[1] [2].
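To make the stumbling block concrete, here is a minimal, runnable illustration of the current definition-time behavior (the function name is invented for the example):

```python
def append_one(lst=[]):
    # under current semantics, the [] is created once, at definition time
    lst.append(1)
    return lst

first = append_one()
second = append_one()
# every call without an argument shares that single list
assert first is second
assert second == [1, 1]
```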
     There are currently few, if any, known good uses of the current 
behavior of mutable default arguments.  The most common one is to 
preserve function state between calls.  However, as one of the lists [2] 
comments, this purpose is much better served by decorators, classes, or 
(though less preferred) global variables.
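As a sketch of the class-based alternative for preserving state between calls (the names here are illustrative, not from the PEP):

```python
class Accumulator(object):
    """Holds the state that a mutable default argument would
    otherwise hide inside the function object."""
    def __init__(self):
        self.seen = []

    def add(self, item):
        self.seen.append(item)
        return self.seen

acc = Accumulator()
assert acc.add(1) == [1]
assert acc.add(2) == [1, 2]
```

Unlike a mutable default, the state here is explicit and each Accumulator instance gets its own.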
     Therefore, since the current semantics aren't useful for 
non-constant default values and an idiom is necessary to work around 
this deficiency, why not change the semantics so that people can write 
what they mean more directly, without the annoying boilerplate?


Rationale

     Originally, it was proposed that all default argument values be
deep-copied from the original (evaluated at definition-time) at each 
invocation of the function where the default value was required. 
However, this doesn't take into account default values that are not 
literals, e.g. function calls, subscripts, attribute accesses.  Thus, 
the new idea was to re-evaluate the default arguments at each call where 
they were needed. There was some concern over the possible performance 
hit this could cause, and whether there should be new syntax so that 
code could use the existing semantics for performance reasons.  Some of 
the proposed syntaxes were:

     def foo(bar=<baz>):

     def foo(bar=new baz):

     def foo(bar=fresh baz):

     def foo(bar=separate baz):

     def foo(bar=another baz):

     def foo(bar=unique baz):

where the new keyword (or angle brackets) would indicate that the 
parameter's default argument should use the new semantics.  Other 
parameters would continue to use the old semantics. It was generally 
agreed that the angle-bracket syntax was particularly ugly, leading to 
the proposal of the other syntaxes. However, having 2 different sets of 
semantics could be confusing and leaving in the old semantics just for 
performance might be premature optimization. Refactorings to deal with 
the possible performance hit are discussed below.


Specification

     The current semantics for default arguments are replaced by the
following semantics:
     - Whenever a function is called, and the caller does not provide a
     value for a parameter with a default expression, the parameter's
     default expression shall be evaluated in the function's scope.  The
     resulting value shall be assigned to a local variable in the
     function's scope with the same name as the parameter.
     - The default argument expressions shall be evaluated before the
     body of the function.
     - The evaluation of default argument expressions shall proceed in
     the same order as that of the parameter list in the function's
     definition.

Given these semantics, it makes more sense to refer to default argument
expressions rather than default argument values, as the expression is
re-evaluated at each call, rather than just once at definition-time.
Therefore, we shall do so hereafter.

Demonstrative examples of new semantics:
     #default argument expressions can refer to
     #variables in the enclosing scope...
     CONST = "hi"
     def foo(a=CONST):
         print a

     >>> foo()
     hi
     >>> CONST = "bye"
     >>> foo()
     bye

     #...or even other arguments
     def ncopies(container, n=len(container)):
         return [container for i in range(n)]

     >>> ncopies([1, 2], 5)
     [[1, 2], [1, 2], [1, 2], [1, 2], [1, 2]]
     >>> ncopies([1, 2, 3])
     [[1, 2, 3], [1, 2, 3], [1, 2, 3]]
     >>> #ncopies grabbed n from [1, 2, 3]'s length (3)

     #default argument expressions are arbitrary expressions
     def my_sum(lst):
         cur_sum = lst[0]
         for i in lst[1:]: cur_sum += i
         return cur_sum

     def bar(b=my_sum((["b"] * (2 * 3))[:4])):
         print b

     >>> bar()
     bbbb

     #default argument expressions are re-evaluated at every call...
     from random import randint
     def baz(c=randint(1,3)):
         print c

     >>> baz()
     >>> baz()

     #...but only when they're required
     def silly():
         print "spam"
         return 42

     def qux(d=silly()):
         pass

     >>> qux()
     spam
     >>> qux(17)
     >>> qux(d=17)
     >>> qux(*[17])
     >>> qux(**{'d':17})
     >>> #no output for the last four calls: silly() was never called,
     because d's value was specified in each of those calls

     #the third rule: defaults are evaluated in parameter-list order
     count = 0
     def next():
         global count
         count += 1
         return count - 1

     def frobnicate(g=next(), h=next(), i=next()):
         print g, h, i

     >>> frobnicate()
     0 1 2
     >>> #g, h, and i's default argument expressions are evaluated in
     the same order as the parameter definitions

Backwards Compatibility

     This change in semantics breaks all code which uses mutable default 
argument values. Such code can be refactored from:

     def foo(bar=mutable):
         #rest of function

to:

     def stateify(state):
         def _wrap(func):
             def _wrapper(*args, **kwds):
                 kwds['bar'] = state
                 return func(*args, **kwds)
             return _wrapper
         return _wrap

     @stateify(mutable)
     def foo(bar):
         #rest of function

or:

     state = mutable
     def foo(bar=state):
         #rest of function

or:

     class Baz(object):
         def __init__(self):
             self.state = mutable

         def foo(self, bar=self.state):
             #rest of function

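As an illustrative check that the decorator refactoring behaves as intended (the decorator is repeated here so the sketch is self-contained; 'record' and 'shared_state' are invented example names):

```python
def stateify(state):
    # same decorator as in the refactoring above, repeated for a
    # self-contained demonstration
    def _wrap(func):
        def _wrapper(*args, **kwds):
            kwds['bar'] = state
            return func(*args, **kwds)
        return _wrapper
    return _wrap

shared_state = []

@stateify(shared_state)
def record(item, bar=None):
    # 'bar' receives the decorator-managed state on every call
    bar.append(item)
    return bar

assert record(1) == [1]
assert record(2) == [1, 2]
```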
The changes in this PEP are backwards-compatible with all code whose 
default argument values are immutable, including code using the idiom 
mentioned in the 'Motivation' section. However, such values will now be 
recomputed for each call for which they are required. This may cause 
performance degradation. If such recomputation is significantly 
expensive, the same refactorings mentioned above can be used.

     In relation to Python 3.0, this PEP's proposal is compatible with 
those of PEP 3102 [3] and PEP 3107 [4]. Also, this PEP does not depend 
on the acceptance of either of those PEPs.

Reference Implementation

     All code of the form:

         def foo(bar=some_expr, baz=other_expr):

     should act as if it had read (in pseudo-Python):

         def foo(bar=_undefined, baz=_undefined):
             if bar is _undefined:
                 bar = some_expr
             if baz is _undefined:
                 baz = other_expr

where _undefined is the value given to a parameter when the caller 
didn't specify a value for it. This is not intended to be a literal 
translation, but rather a demonstration as to how Python's internal 
argument-handling machinery should be changed.
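The sentinel translation above can be emulated by hand in today's Python; this sketch fills in hypothetical expressions for some_expr and other_expr:

```python
_undefined = object()  # unique sentinel, distinct from None

def foo(bar=_undefined, baz=_undefined):
    # hand-written version of the proposed call-time evaluation
    if bar is _undefined:
        bar = []    # stands in for some_expr
    if baz is _undefined:
        baz = {}    # stands in for other_expr
    return bar, baz

a, _ = foo()
b, _ = foo()
assert a is not b          # each call evaluates its default afresh
assert foo([1])[0] == [1]  # explicit arguments bypass the default
```

A unique sentinel, rather than None, is needed so that callers can still pass None explicitly.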


References

     [0] 10 Python pitfalls

     [1] Python Gotchas

     [2] When Pythons Attack

     [3] Keyword-Only Arguments

     [4] Function Annotations
