My 2¢ (perhaps it should be 3¢ as I've already contributed 2¢).
Chris A did ask
"do Python core devs agree with less-skilled Python programmers
on the intuitions?"
so, putting myself firmly in the second camp (though I have been
using Python for over a decade), here are my thoughts in case they
have some slight value.
Again, +1 on the PEP.
The absence of late-binding argument defaults is a gap in Python.
Whether it is a serious enough gap to warrant plugging it is of
course a matter of opinion. IMO most people find late binding more
natural (and probably more useful) than early binding. Witness the
number of Stack Overflow questions about it. Yes, there would be
more questions asking what the difference is, if late binding were
provided, but hey, people have to learn to use the tools in their
box.
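(To be concrete, the classic gotcha behind many of those Stack
Overflow questions - my own illustrative sketch, names made up, not
taken from the PEP - is the early-bound mutable default:

    def append_item(item, target=[]):
        # the [] is evaluated once, at def time, and shared by every call
        target.append(item)
        return target

    append_item(1)   # [1]
    append_item(2)   # [1, 2] - same list as last time

which is precisely the case where late binding is what people
expect.)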
Syntax bikeshedding: I still favour
var := expr
IMO the similarity to early binding syntax is a good thing (or at
least not a bad thing). Just as the walrus operator is similar to
`=` - after all they are both a form of assignment. As is `=` in a
function signature. I see no need to add a new symbol.
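To show what I mean (this is only my preferred spelling - it is not
valid in any released Python, and the PEP itself currently spells it
`=>`):

    def f(x=[], y:=[]):
        ...

Here `x` would get one shared list at def time, while `y` would get a
fresh list on every call that omits it.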
I don't like a keyword (hard or soft). It's verbose. It's
unnecessary. And if it's `defer`, I find it too reminiscent of
Twisted's deferreds (which I always have trouble getting my head
round, although I've used them many times), suggesting that the
expression is actually a thunk, or some async feature, or something
else weird and wonderful.
I don't think argument defaults should be allowed to refer to later
arguments (or of course the current argument). That's making the
interpreter's task too complicated, not to mention (surely?)
inefficient. And it's confusing. And it's a restriction which
could be removed later if desirable. (Actually I think I have
written code where either, but not both, of 2 arguments were
mandatory, but I can't recall the details. I can live with having
to code this explicitly, using arg=None or some such.)
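(For the record, the explicit version I have in mind is something
like this - a made-up sketch, not the code I actually wrote:

    def f(a=None, b=None):
        # caller must supply exactly one of a, b
        if (a is None) == (b is None):
            raise TypeError("supply exactly one of a or b")
        ...

Clunky, but perfectly liveable with.)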
I don't think it's a huge deal whether attempting it causes a
SyntaxError or a runtime (UnboundLocalError?) error, though if it
can be a SyntaxError that's obviously quicker to debug. Although as
Steven said:
Why would this be a "hard-to-track-down" bug? You get an
UnboundLocalError telling you exactly what the problem is.
UnboundLocalError: local variable 'b' referenced before assignment
(and presumably the line number)
I don't think making it a SyntaxError is 100% "breaking new
ground" [contra Guido], as e.g.
    def f():
        x = x+1
        global y
is not a SyntaxError, but if you change `y` to `x` it is.
I respectfully disagree with Marc-Andre Lemburg:
"Explicit is better than implicit" and this is too much
"implicit"
for my taste.
For simple use cases, this may save a few lines of code,
but as soon
as you end up having to think whether the expression will
evaluate to
the right value at function call time, the scope it gets
executed
in, what to do with exceptions, etc., you're introducing
too much
confusion with this syntax.
Example:
def process_files(processor,
files=>os.listdir(DEFAULT_DIR)):
(a) Why is a late-bound default any more implicit than an
early-bound default? Why is a late-bound default more confusing
than an early-bound default? Why should there be more confusion
over an early-bound default evaluated in the outer/global scope than
a late-bound default evaluated in the function scope?
(b) It's unfair to denigrate the proposal of late-bound defaults by
showing how it can be abused in an example where the default value
can vary wildly (and might not even be under the programmer's
control). Any feature can be abused. You always have the status
quo option of explicitly coding what you mean rather than using (any
kind of) defaults.
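After all, the explicit spelling of that very example remains
available (a sketch; DEFAULT_DIR is a stand-in here, since the
quoted snippet doesn't say where it comes from):

    import os
    DEFAULT_DIR = "."   # stand-in value for the sake of a runnable example

    def process_files(processor, files=None):
        if files is None:
            files = os.listdir(DEFAULT_DIR)
        ...

and nobody would be forced to stop writing it that way.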
I agree with Chris A here:
Having a new category of function parameters would make these calls
even more complicated. It also overemphasizes, in my opinion, the
difference between ways that optional arguments are provided with
their values.
though truth to tell this is mainly because, as a Bear of Little
Brain, I find the existing categories with the `/` and `*`
separators quite enough to confuse me already.
Of the two options given at some point in the thread by Chris A:
1) Arguments are defined left-to-right, each one independently of
each other
2) Early-bound arguments and those given values are defined first,
then late-bound arguments
The first option is much easier to explain, but will never give
useful results for out-of-order references (unless it's allowed to
refer to the containing scope or something). The second is closer to
the "if x is None: x = y + 1" equivalent, but is harder to explain.
I prefer 1). Easier to understand and debug in examples with
side-effects such as
    def f(a := enter_codes(), b = assign_targets(), c := unlock_missiles(), d = FIRE()):
(not that this is something to be particularly encouraged).
Re Guido's suggestions:
Maybe you can't combine early and late binding defaults in the same
signature.
Or maybe all early binding defaults must precede all late binding
defaults.
I don't like the first. While it is always safer to forbid
something first and maybe allow it later, IMO mixing early and late
binding is something that will inevitably be wanted sooner or
later. And I hazard a guess that it wouldn't (much) simplify the
implementation to forbid it.
As to the second:
As far as I can see it would have the same effect as straight
L-to-R evaluation, except that it would allow a late-binding default
to refer to a subsequent early-binding default, e.g.
    def f(a := b+1, b = b_default):
I don't feel strongly about this. But L-to-R is nice and
simple, and I would reverse the parameter order here to make it work
(and be more comprehensible).
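i.e. with straight L-to-R evaluation something like

    def f(b = b_default, a := b+1):
        ...

(using my preferred `:=` spelling, and whatever b_default was in the
original) works with no special rule needed, since `b` is already
bound by the time `a`'s default is evaluated.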
Best wishes
Rob Cliffe