[Python-ideas] proto-PEP: Fixing Non-constant Default Arguments
Jim Jewett
jimjjewett at gmail.com
Mon Jan 29 23:51:04 CET 2007
On 1/28/07, Chris Rebert <cvrebert at gmail.com> wrote:
> Naive programmers desiring mutable default arguments
> often make the mistake of writing the following:
> def foo(mutable=some_expr_producing_mutable):
>     #rest of function
Yes, it is an ugly gotcha, but so is the alternative.
If it is changed, then just as many naive programmers (though perhaps
not exactly the same ones) will make the opposite mistake -- and so
will some experienced programmers who are used to current semantics.
In a dynamic language, it makes perfect sense to reevaluate the entire
call signature with each call -- but in almost any language, it makes
perfect sense for the signature to be a constant. Usually, it doesn't
matter which you assume, which is why this problem doesn't get ironed
out in the first few minutes of writing python.
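To make the gotcha concrete, here is a sketch of what def-time
evaluation actually does (the names are mine):

    def append_one(items=[]):    # the [] is evaluated once, at def time
        items.append(1)
        return items

    print append_one()    # [1]
    print append_one()    # [1, 1] -- the same list object persists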
> There are currently few, if any, known good uses of the current
> behavior of mutable default arguments. The most common one is to
> preserve function state between calls. However, as one of the list
> comments [2] notes, this purpose is much better served by decorators,
> classes, or
> (though less preferred) global variables.
I disagree. This is particularly wrong for someone coming from a
functional background.
A class plus an instantiation seems far too heavyweight for what ought
to be a simple function. I'm not talking (only) about the runtime;
the number of methods and lines of code is the real barrier for me.
I'll sometimes do it (or use a global) anyhow, but it feels wrong. If
I had felt that need more often when I was first learning python (or
before I knew about the __call__ workaround), I might have just
written the language off as no less bloated than java.
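For comparison, a rough sketch (names invented for illustration) of
the one-token default-argument idiom next to the class-plus-__call__
workaround it would have to become:

    # state via a deliberately mutable default -- one line of state:
    def next_id(_counter=[0]):
        _counter[0] += 1
        return _counter[0]

    # the same thing as a class: a class statement, two methods, and
    # an explicit instantiation
    class NextId(object):
        def __init__(self):
            self.n = 0
        def __call__(self):
            self.n += 1
            return self.n

    next_id2 = NextId()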
You see the problem with globals, but decorators are in some sense
worse -- a function cannot see its own decorations. At best, it can
*assume* that repeating its own name (despite Don't Repeat Yourself)
will yield another reference to itself, but this isn't always true.
Programmers used to creating functions outside of toplevel (or
class-level) will be more aware of this, and see the suggestion as an
indication that python is inherently buggy.
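A sketch of that failure mode (the decorator and names are invented
for illustration):

    def counted(f):
        def wrapper(*args):
            wrapper.calls += 1
            return f(*args)
        wrapper.calls = 0
        return wrapper

    @counted
    def ping():
        # "ping" is looked up at call time; the body can only *assume*
        # it still names the decorated wrapper
        return ping.calls

    print ping()    # 1, as expected

    alias, ping = ping, None
    # alias() now raises AttributeError -- the body's "ping" is None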
> def foo(bar=new baz):
>     #code
This would be less bad.
That said, I fear many new programmers would fail to understand when
they needed new and when they didn't, so that in practice, it would be
just optional random noise.
> Demonstrative examples of new semantics:
> #default argument expressions can refer to
> #variables in the enclosing scope...
> CONST = "hi"
> def foo(a=CONST):
>     print a
This would work if there were any easy way to create a new scope. In
Lisp, it makes sense. In python, it would probably be better to just
find a way for functions to refer to their own decorations reliably.
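One shape such a facility might take -- purely a sketch, nothing like
this exists in the stdlib -- is a decorator that hands the function
its decorated self explicitly:

    def self_aware(func):
        # pass the wrapper itself as the first argument, so the body
        # never has to guess what its own name is currently bound to
        def wrapper(*args, **kwargs):
            return func(wrapper, *args, **kwargs)
        wrapper.state = []    # illustrative per-function state
        return wrapper

    @self_aware
    def remember(me, item):
        me.state.append(item)
        return me.state

    print remember('a')    # ['a']
    print remember('b')    # ['a', 'b']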
> Backwards Compatibility
>
> This change in semantics breaks all code which uses mutable default
> argument values. Such code can be refactored from:
>
> def foo(bar=mutable):
>     #code
>
> to
[The decorator option quoted above uses 3 levels of nested functions,
which aren't even generic enough for reuse unless you give up
introspection.]
No. It could be done with a (more complex) decorator, if that
decorator came with the stdlib (probably in functools), but that is
heavy backwards incompatibility just to change a corner case (most
defaults aren't mutable) in a taste-dependent manner.
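Such a decorator might look roughly like this (fresh_defaults is a
hypothetical name, not an existing functools API; note it still loses
signature introspection, even with functools.wraps):

    import copy
    import functools

    def fresh_defaults(func):
        # hypothetical helper: snapshot the defaults bound at def
        # time, then install deep copies before every call
        saved = func.func_defaults
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            func.func_defaults = copy.deepcopy(saved)
            return func(*args, **kwargs)
        return wrapper

    @fresh_defaults
    def foo(bar=[]):
        bar.append(1)
        return bar

    print foo()    # [1]
    print foo()    # [1] again -- each call gets a fresh list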
-jJ