[Python-ideas] proto-PEP: Fixing Non-constant Default Arguments
Chris Rebert
cvrebert at gmail.com
Tue Jan 30 06:09:29 CET 2007
Jim Jewett wrote:
> On 1/28/07, Chris Rebert <cvrebert at gmail.com> wrote:
>
>> Naive programmers desiring mutable default arguments
>> often make the mistake of writing the following:
>
>> def foo(mutable=some_expr_producing_mutable):
>>     #rest of function
>
> Yes, it is an ugly gotcha, but so is the alternative.
>
> If it is changed, then just as many naive programmers (though perhaps
> not exactly the same ones) will make the opposite mistake -- and so
> will some experienced programmers who are used to current semantics.
Anyone (ab)using the current semantics with the above construct is, or
ought to be, aware that such usage is unpythonic. Also, due to the
hairiness/obscurity of the construct, hardly anyone uses it, so the
impact of removing it should be relatively minor. Additionally, making
the opposite mistake would usually cause, at worst, a performance
problem rather than incorrect program behavior. As discussed in my
proposal, even that concern may be premature optimization. If it turns
out to be a serious issue (which the discussion so far has not
indicated), we can add syntax to select the old/new semantics, which I
also mentioned in the proposal.
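For readers following along, the gotcha in its classic form (a minimal
illustration, not code from the PEP): the default is evaluated once, at
def time, so every call shares the same list.

def append_item(item, seq=[]):
    seq.append(item)  #mutates the single shared default list
    return seq

append_item(1)  #returns [1]
append_item(2)  #returns [1, 2] -- state leaks between calls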
>> There are currently few, if any, known good uses of the current
>> behavior of mutable default arguments. The most common one is to
>> preserve function state between calls. However, as one of the list
>> comments [2] points out, this purpose is much better served by
>> decorators, classes, or (though less preferred) global variables.
>
> I disagree. This is particularly wrong for someone coming from a
> functional background.
I assume you disagree with the "purpose is much better served by
decorators, classes, or global variables" part as opposed to the
"default mutable arguments with current semantics have few good uses" part.
Please correct me if I'm in error.
> A class plus an instantiation seems far too heavyweight for what ought
> to be a simple function. I'm not talking (only) about the runtime;
> the number of methods and lines of code is the real barrier for me.
> I'll sometimes do it (or use a global) anyhow, but it feels wrong. If
> I had felt that need more often when I was first learning python (or
> before I knew about the __call__ workaround), I might have just
> written the language off as no less bloated than java.
I'm sorry, but when you have a function sharing state between calls,
that just screams to me that you should make it a method of an object so
you don't have to store the state in some roundabout way, such as in
mutable default arguments. If performance is your concern, the decorator
version might perform better (I don't really know), or in the extreme
case, you could use a global variable, which is definitely faster. Speed
and elegance often come in inverse proportions. Also, I've revised the
refactoring in question so that you no longer need to instantiate the
class, which at least makes it marginally better.
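To illustrate the revision (a hypothetical sketch, not the PEP's actual
code): the shared state can live on the class itself, so the call site
never instantiates anything explicitly.

class accumulator(object):
    items = []  #shared state on the class, not on an instance

    @classmethod
    def add(cls, item):
        cls.items.append(item)
        return cls.items

accumulator.add(1)  #[1]
accumulator.add(2)  #[1, 2]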
> You see the problem with globals, but decorators are in some sense
> worse -- a function cannot see its own decorations. At best, it can
> *assume* that repeating its own name (despite Don't Repeat Yourself)
> will get another reference to self, but this isn't always true.
I really don't see how the decorator in the PEP is any worse than other
decorators in this regard. The problem you describe currently applies to
all decorated functions, though functools.wraps might help mitigate this
situation.
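For example (a generic sketch, not the decorator from the PEP),
functools.wraps copies the wrapped function's metadata onto the
wrapper, so introspection still sees the original name and docstring:

import functools

def logged(func):
    @functools.wraps(func)  #copies __name__, __doc__, etc. to the wrapper
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@logged
def greet():
    "Say hello."

assert greet.__name__ == 'greet'  #without wraps, this would be 'wrapper'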
> Programmers used to creating functions outside of toplevel (or
> class-level) will be more aware of this, and see the suggestion as an
> indication that python is inherently buggy.
>
>> def foo(bar=new baz):
>>     #code
>
> This would be less bad.
>
> That said, I fear many new programmers would fail to understand when
> they needed new and when they didn't, so that in practice, it would be
> just optional random noise.
This is part of the reason I'm trying to avoid adding new syntax.
However, I assert that 'new' is at least clearer than the 'x=None; if x
is None: x=expr' idiom in that it expresses one's intent more directly.
Also, this would at least be a prettier way to spell the idiom even if
the reason still needed explaining. Alternatively, we could make the new
semantics the default and have syntax to use the old semantics via
'once' or some other keyword. This does nicely emphasize that such
semantics would be purely for optimization reasons. I think I'll add
this to the PEP.
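For comparison, here is the idiom spelled out in full; the proposed
'new' keyword would express the same intent in a single line (the
second version is hypothetical syntax, of course):

#today's idiom:
def foo(x=None):
    if x is None:
        x = []  #fresh list on every call
    #rest of function

#with the proposed keyword (hypothetical):
#def foo(x=new []):
#    #rest of function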
>> Demonstrative examples of new semantics:
>> #default argument expressions can refer to
>> #variables in the enclosing scope...
>> CONST = "hi"
>> def foo(a=CONST):
>>     print a
>
> This would work if there were any easy way to create a new scope. In
> Lisp, it makes sense. In python, it would probably be better to just
> find a way for functions to refer to their own decorations reliably.
This is outside the scope of my PEP. However, the improvement below
should help, and you could always use one of the other refactorings to
work around this issue.
>> Backwards Compatibility
>>
>> This change in semantics breaks all code which uses mutable default
>> argument values. Such code can be refactored from:
>>
>> def foo(bar=mutable):
>>     #code
>>
>> to
>
> [a decorator option uses 3 levels of nested functions, which aren't even
> generic enough for reuse, unless you give up introspection.]
>
> No. It could be done with a (more complex) decorator, if that
> decorator came with the stdlib (probably in functools)
Agreed, the decorator could be better. I've just enhanced it using
functools.wraps, which should help with the introspection issues you
raise. Other improvements to the refactoring code in the PEP are welcome.
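For concreteness, a minimal sketch of such a decorator (the names here
are illustrative, not the PEP's exact code): each defaulted parameter
maps to a zero-argument callable producing a fresh value on every call,
and functools.wraps preserves the wrapped function's metadata. It
assumes the defaulted parameters are passed by keyword or omitted.

import functools

def fresh_defaults(**factories):
    def decorator(func):
        @functools.wraps(func)  #preserve __name__, __doc__, etc.
        def wrapper(*args, **kwargs):
            for name, factory in factories.items():
                kwargs.setdefault(name, factory())  #fresh value per call
            return func(*args, **kwargs)
        return wrapper
    return decorator

@fresh_defaults(bar=list)
def foo(item, bar):
    bar.append(item)
    return bar

foo(1)  #[1]
foo(2)  #[2] -- no state shared between calls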
- Chris Rebert