[Python-ideas] proto-PEP: Fixing Non-constant Default Arguments
Roman Susi
rnd at onego.ru
Mon Jan 29 08:38:39 CET 2007
Hello!
I'd like to say outright that this is a bad idea which complicates matters
more than it provides solutions.
Right now it is enough to know that the part from def to ":" is executed
at definition time. This is what incremental dynamic semantics is about.
So, the suggestion is good only as a separate feature, but is IMHO wrong
if considered in the language design as a whole.
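For example, under the current rules the default expression in the header
runs exactly once, when the def statement itself executes:

    import time

    def stamp(t=time.time()):
        # time.time() runs here, at def-time
        return t

Every later call to stamp() returns that same def-time value unless t is
passed explicitly.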
So things like

    def foo(non_const=None):
        non_const = non_const or []

are good because they explicitly tell you that the mutable object is to be
created at call-time, not def-time.
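One caveat: the "or" idiom replaces *any* false value, so even an empty
list passed in explicitly gets swapped for a fresh one:

    def foo(non_const=None):
        non_const = non_const or []
        non_const.append(1)
        return non_const

    lst = []
    foo(lst)  # appends to a fresh list; lst itself stays []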
And I do not like PEP 3107 either: it's overly complex.
If there is a need for Python type checking, I'd suggest making a special
superset which could be used to write compiled extensions as well (Pyrex
comes to mind).
Regards,
Roman
P.S. However, I may be wrong. In that case my syntax suggestion would be
this:

    def foo(non_const or []):
        ...

where [] is executed at runtime BECAUSE at def time non_const is somehow
True and that is enough to leave [] alone.
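Roughly, this syntax could desugar to something like the following sketch
(_missing is a made-up sentinel; note it would share the false-value
caveat mentioned above):

    _missing = object()

    def foo(non_const=_missing):
        if non_const is _missing or not non_const:
            non_const = []  # a fresh list on each call
        ...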
I have not checked, but I believe it is backward compatible.
Anyway, could you summarize both the counter-argument and this syntax
proposal in the PEP?
Chris Rebert wrote:
>The following is a proto-PEP based on the discussion in the thread
>"fixing mutable default argument values". Comments would be greatly
>appreciated.
>- Chris Rebert
>
>Title: Fixing Non-constant Default Arguments
>
>Abstract
>
> This PEP proposes new semantics for default arguments to remove
>boilerplate code associated with non-constant default argument values,
>allowing them to be expressed more clearly and succinctly.
>
>
>Motivation
>
> Currently, to write functions using non-constant default arguments,
>one must use the idiom:
>
>    def foo(non_const=None):
>        if non_const is None:
>            non_const = some_expr
>        #rest of function
>
>or equivalent code. Naive programmers desiring mutable default arguments
>often make the mistake of writing the following:
>
>    def foo(mutable=some_expr_producing_mutable):
>        #rest of function
>
>However, this does not work as intended, as
>'some_expr_producing_mutable' is evaluated only *once* at
>definition-time, rather than once per call at call-time. This results
>in all calls to 'foo' sharing the same default value, which can lead to
>unintended consequences. This necessitates the previously mentioned
>idiom. This unintuitive behavior is such a frequent stumbling block for
>newbies that it is present in at least 3 lists of Python's problems [0]
>[1] [2].
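>For instance (an illustrative sketch of the pitfall):
>
>    def append_to(item, lst=[]):
>        lst.append(item)
>        return lst
>
>    >>> append_to(1)
>    [1]
>    >>> append_to(2)
>    [1, 2]
>    >>> #both calls mutate the single list created at def-time
>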
> There are currently few, if any, known good uses of the current
>behavior of mutable default arguments. The most common one is to
>preserve function state between calls. However, as one of the lists [2]
>comments, this purpose is much better served by decorators, classes, or
>(though less preferred) global variables.
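>For example, the state-preserving use typically looks like this
>memoization sketch (illustrative only):
>
>    def fib(n, _cache={}):
>        #_cache persists across calls precisely because the
>        #default is evaluated once, at def-time
>        if n not in _cache:
>            _cache[n] = n if n < 2 else fib(n - 1) + fib(n - 2)
>        return _cache[n]
>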
> Therefore, since the current semantics aren't useful for
>non-constant default values and an idiom is necessary to work around
>this deficiency, why not change the semantics so that people can write
>what they mean more directly, without the annoying boilerplate?
>
>
>Rationale
>
> Originally, it was proposed that all default argument values be
>deep-copied from the original (evaluated at definition-time) at each
>invocation of the function where the default value was required.
>However, this doesn't take into account default values that are not
>literals, e.g. function calls, subscripts, attribute accesses. Thus,
>the new idea was to re-evaluate the default arguments at each call where
>they were needed. There was some concern over the possible performance
>hit this could cause, and whether there should be new syntax so that
>code could use the existing semantics for performance reasons. Some of
>the proposed syntaxes were:
>
>    def foo(bar=<baz>):
>        #code
>
>    def foo(bar=new baz):
>        #code
>
>    def foo(bar=fresh baz):
>        #code
>
>    def foo(bar=separate baz):
>        #code
>
>    def foo(bar=another baz):
>        #code
>
>    def foo(bar=unique baz):
>        #code
>
>where the new keyword (or angle brackets) would indicate that the
>parameter's default argument should use the new semantics. Other
>parameters would continue to use the old semantics. It was generally
>agreed that the angle-bracket syntax was particularly ugly, leading to
>the proposal of the other syntaxes. However, having 2 different sets of
>semantics could be confusing and leaving in the old semantics just for
>performance might be premature optimization. Refactorings to deal with
>the possible performance hit are discussed below.
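>Returning to the deep-copy idea above, a sketch of why it falls short
>for non-literal default expressions (illustrative):
>
>    import time
>
>    def log_event(msg, stamp=time.time()):
>        #a deep copy of stamp would still be the def-time float;
>        #only re-evaluating time.time() per call gives a fresh stamp
>        print stamp, msg
>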
>
>
>Specification
>
> The current semantics for default arguments are replaced by the
>following semantics:
> - Whenever a function is called, and the caller does not provide a
>   value for a parameter with a default expression, the parameter's
>   default expression shall be evaluated in the function's scope. The
>   resulting value shall be assigned to a local variable in the
>   function's scope with the same name as the parameter.
> - The default argument expressions shall be evaluated before the
>   body of the function.
> - The evaluation of default argument expressions shall proceed in
>   the same order as that of the parameter list in the function's
>   definition.
>Given these semantics, it makes more sense to refer to default argument
>expressions rather than default argument values, as the expression is
>re-evaluated at each call, rather than just once at definition-time.
>Therefore, we shall do so hereafter.
>
>Demonstrative examples of new semantics:
>    #default argument expressions can refer to
>    #variables in the enclosing scope...
>    CONST = "hi"
>    def foo(a=CONST):
>        print a
>
>    >>> foo()
>    hi
>    >>> CONST = "bye"
>    >>> foo()
>    bye
>
>    #...or even other arguments
>    def ncopies(container, n=len(container)):
>        return [container for i in range(n)]
>
>    >>> ncopies([1, 2], 5)
>    [[1, 2], [1, 2], [1, 2], [1, 2], [1, 2]]
>    >>> ncopies([1, 2, 3])
>    [[1, 2, 3], [1, 2, 3], [1, 2, 3]]
>    >>> #ncopies grabbed n from [1, 2, 3]'s length (3)
>
>    #default argument expressions are arbitrary expressions
>    def my_sum(lst):
>        cur_sum = lst[0]
>        for i in lst[1:]:
>            cur_sum += i
>        return cur_sum
>
>    def bar(b=my_sum((["b"] * (2 * 3))[:4])):
>        print b
>
>    >>> bar()
>    bbbb
>
>    #default argument expressions are re-evaluated at every call...
>    from random import randint
>    def baz(c=randint(1, 3)):
>        print c
>
>    >>> baz()
>    2
>    >>> baz()
>    3
>
>    #...but only when they're required
>    def silly():
>        print "spam"
>        return 42
>
>    def qux(d=silly()):
>        pass
>
>    >>> qux()
>    spam
>    >>> qux(17)
>    >>> qux(d=17)
>    >>> qux(*[17])
>    >>> qux(**{'d': 17})
>    >>> #no output: silly() is never called when d's value
>    >>> #is specified in the call
>
>    #Rule 3
>    count = 0
>    def next():
>        global count
>        count += 1
>        return count - 1
>
>    def frobnicate(g=next(), h=next(), i=next()):
>        print g, h, i
>
>    >>> frobnicate()
>    0 1 2
>    >>> #g, h, and i's default argument expressions are
>    >>> #evaluated in the same order as the parameter definition
>
>
>Backwards Compatibility
>
> This change in semantics breaks all code which uses mutable default
>argument values. Such code can be refactored from:
>
>    def foo(bar=mutable):
>        #code
>
>to
>
>    def stateify(state):
>        def _wrap(func):
>            def _wrapper(*args, **kwds):
>                kwds['bar'] = state
>                return func(*args, **kwds)
>            return _wrapper
>        return _wrap
>
>    @stateify(mutable)
>    def foo(bar):
>        #code
>
>or
>
>    state = mutable
>    def foo(bar=state):
>        #code
>
>or
>
>    class Baz(object):
>        def __init__(self):
>            self.state = mutable
>
>        def foo(self, bar=self.state):
>            #code
>
>The changes in this PEP are backwards-compatible with all code whose
>default argument values are immutable, including code using the idiom
>mentioned in the 'Motivation' section. However, such values will now be
>recomputed for each call for which they are required. This may cause
>performance degradation. If such recomputation is significantly
>expensive, the same refactorings mentioned above can be used.
>
> In relation to Python 3.0, this PEP's proposal is compatible with
>those of PEP 3102 [3] and PEP 3107 [4]. Also, this PEP does not depend
>on the acceptance of either of those PEPs.
>
>
>Reference Implementation
>
> All code of the form:
>
>    def foo(bar=some_expr, baz=other_expr):
>        #body
>
> should act as if it had read (in pseudo-Python):
>
>    def foo(bar=_undefined, baz=_undefined):
>        if bar is _undefined:
>            bar = some_expr
>        if baz is _undefined:
>            baz = other_expr
>        #body
>
>where _undefined is the value given to a parameter when the caller
>didn't specify a value for it. This is not intended to be a literal
>translation, but rather a demonstration as to how Python's internal
>argument-handling machinery should be changed.
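>As a rough illustration only, the proposed behavior can be approximated
>in today's Python with a decorator that takes thunks. The names below are
>made up, the sketch assumes defaults are supplied by keyword, and it does
>not enforce the left-to-right evaluation order of the Specification:
>
>    def call_time_defaults(**thunks):
>        def _wrap(func):
>            def _wrapper(*args, **kwds):
>                #evaluate a default expression only when the
>                #caller did not supply that keyword argument
>                for name in thunks:
>                    if name not in kwds:
>                        kwds[name] = thunks[name]()
>                return func(*args, **kwds)
>            return _wrapper
>        return _wrap
>
>    @call_time_defaults(bar=lambda: [], baz=lambda: {})
>    def foo(bar, baz):
>        pass #body
>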
>
>
>References
>
> [0] 10 Python pitfalls
> http://zephyrfalcon.org/labs/python_pitfalls.html
>
> [1] Python Gotchas
> http://www.ferg.org/projects/python_gotchas.html#contents_item_6
>
> [2] When Pythons Attack
> http://www.onlamp.com/pub/a/python/2004/02/05/learn_python.html?page=2
>
> [3] Keyword-Only Arguments
> http://www.python.org/dev/peps/pep-3102/
>
> [4] Function Annotations
> http://www.python.org/dev/peps/pep-3107/
>_______________________________________________
>Python-ideas mailing list
>Python-ideas at python.org
>http://mail.python.org/mailman/listinfo/python-ideas