
On Thu, Nov 4, 2021 at 5:28 AM Stephen J. Turnbull <stephenjturnbull@gmail.com> wrote:
And that's what happens when you need to be pedantically correct. Not particularly useful to a novice, especially with the FUD at the end.
I have no idea what you mean by "that's what happens," except that apparently you don't like it.
What I mean is that pedantically correct language inevitably ends up way too verbose to be useful in an educational context. (Please explain the behaviour of "yield from" in a generator. Ensure that you are absolutely precisely correct.)
As I see it, a novice will know what a function definition is, and where it is in her code. She will know what a function call is, and where it is in her code. She will have some idea of the order in which things "get done" (in Python, "executed", but she may or may not understand that def is an executable statement).
Given the number of people who assume that function definitions are declarations, it's clear that some things simply have to be learned.
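For what it's worth, the classic toy demonstration that def (and its default expression) really is executed, rather than declared, is something like this:

    import time

    def stamp(t=time.time()):      # the default runs here, when the def executes
        return t

    print(stamp())
    time.sleep(0.5)
    print(stamp())                 # same value: the default was bound once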
She can see the items in question, or see that they're not there when the argument is defaulted. To me, that concrete explanation in terms of the code on the screen will be more useful *to the novice* than the more abstract "when needed".
As for the "FUD", are you implying that you agree with Steve's proposed text? So that if the programmer is unsure, it's perfectly OK to use early binding, no bugs there?
If the programmer is unsure, go ahead and pick something, then move on. It's better to just try something and go than to agonize over which one you should use. Tell people that something is crucially important to get right, and they're more likely to be afraid of it. Give them a viable default (pun partly intended) and it's far less scary.
There's a performance cost to late binding when the result will always be the same. The compiler could optimize "=>constant" to "=constant" at compilation time [1], but if it's not actually a compile-time constant, only a human can know whether that optimization is safe.
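To make the cost concrete with today's idioms (a rough sketch only; the names are made up, and since "=>" isn't valid in any released Python, the usual None-sentinel pattern stands in for it):

    import timeit

    def early(x=100):              # like "=constant": bound once at def time
        return x

    def late(x=None):              # stands in for "x=>100": work on every call
        if x is None:
            x = 100
        return x

    print(timeit.timeit(early))    # no per-call work for the default
    print(timeit.timeit(late))     # pays the check-and-assign on each call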
We're talking about folks new to the late-binding syntax and probably to Python who are in doubt. I don't think performance over correctness is what we want to emphasize here.
Maybe, but there's also a lot of value in defaulting to the fast option. For instance, in a lot of situations, these two will behave identically:

    for key in some_dict:
    for key in list(some_dict):

We default to iterating over the object itself, even though that could break if you mutate the dict during the loop. The slower and less efficient form is reserved for situations where it matters.

That said, it might be better in this case to recommend late-binding by default. But most importantly, either recommended default is better than making things sound scary.
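Concretely, the failure mode and the safe-but-slower form look like this (a standalone sketch):

    d = {"a": 1, "b": 2}
    try:
        for key in d:              # iterate the dict itself: fast
            d[key + "!"] = 0       # ...but adding keys mid-loop raises
    except RuntimeError as e:
        print(e)                   # dictionary changed size during iteration

    d = {"a": 1, "b": 2}
    for key in list(d):            # iterate a snapshot: safe, but slower
        d[key + "!"] = 0
    print(d)                       # {'a': 1, 'b': 2, 'a!': 0, 'b!': 0}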
But then the question becomes: should we recommend late-binding by default, with early-binding as an easy optimization? I'm disinclined to choose at this point, and will leave that up to educators and style guide authors.
I think most educators will go with "when in doubt, ask a mentor if available, or study harder if not -- anyway, you'll get it soon, it's not that hard", or perhaps offer a much longer paragraph of concrete advice. Style guide authors should not touch it, because it's not a matter of style when either choice is a potential bug.
I would initially just recommend early-binding by default, since it's going to have better cross-version compatibility. By the time that's no longer a consideration, I personally, and the world in general, will have a lot more experience with the feature, so we'll be able to make a more informed decision.
You say that there's a deficiency with a generic deferred, in that in your plan late-bound argument expressions can access nonlocals -- but that's not a problem if the deferred is implemented as a closure. (This is an extension of a suggestion by Steven d'Aprano.) I would expect that anyway if generic deferreds are implemented, since they would be objects that could be returned from functions and "ejected" from the defining namespace.
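One crude way to read "deferred as a closure" with today's syntax (made-up names, not any concrete proposal): the default expression is captured as a zero-argument closure over the defining scope and forced only when the argument is missing.

    _MISSING = object()

    def make_adder(amount):
        deferred = lambda: amount          # the "deferred": a closure over amount
        def add(x, by=_MISSING):
            if by is _MISSING:
                by = deferred()            # forced only when the default is needed
            return x + by
        return add

    add5 = make_adder(5)
    print(add5(10))                        # 15: the deferred saw the nonlocal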
If the deferred is implemented as a closure, it would be useless for this proposal. Look at the clunky proposals created to support the bisect example, and the weird messes required to do things that, with a little bit of compiler support, are just ordinary variable references.
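For reference, the bisect example today needs the sentinel dance below; under the proposal the whole thing collapses to "hi=>len(a)" in the signature (proposed syntax, not valid in current Python), where len(a) is just an ordinary reference to the parameter a:

    def bisect(a, x, lo=0, hi=None):   # proposal: def bisect(a, x, lo=0, hi=>len(a))
        if hi is None:                 # None stands in for "not supplied"
            hi = len(a)
        while lo < hi:
            mid = (lo + hi) // 2
            if a[mid] < x:
                lo = mid + 1
            else:
                hi = mid
        return lo

    print(bisect([1, 3, 5, 7], 4))     # 2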
It seems to me that access to nonlocals is also a problem for functions with late-bound argument defaults, if such a function is returned as the value of the defining function. I suppose it would have to be solved in the same way, by making that function a closure. So the difference is whether the closure is in the default itself, or in the function where the default is defined. But the basic issue, and its solution, is the same. That might be "un-Pythonic" for deferreds, but at the moment I can't see why it's more un-Pythonic for deferreds than for local functions as return values.
Not sure what you mean, but the way I've implemented it, if you refer to a nonlocal in a late-bound default, it makes a closure just the same as if you referred to it in the function body. This is another reason that generic deferreds wouldn't work, since the compiler knows about the late default while compiling the *parent* function (and can thus make closure cells as appropriate).
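As a rough emulation with current syntax (made-up names, with the body check standing in for the default expression): a default that refers to an enclosing variable forces the inner function to become a closure, exactly as a body reference does.

    def make_counter(step):
        def bump(n, by=None):
            if by is None:
                by = step          # stands in for "by=>step" under the proposal
            return n + by
        return bump

    bump = make_counter(5)
    print(bump(10))                           # 15
    print(bump.__code__.co_freevars)          # ('step',): a closure cell exists
    print(bump.__closure__[0].cell_contents)  # 5

ChrisA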