On Fri, 27 Nov 2020 13:06:56 +1300 Greg Ewing firstname.lastname@example.org wrote:
On 27/11/20 12:11 am, Paul Sokolovsky wrote:
On Thu, 19 Nov 2020 18:53:01 +1300 Greg Ewing email@example.com wrote:
Note that this is *not* the same as introducing a new scope.
And that's very sad. It means that instead of solving the root of the problem, ad hoc workarounds proliferate.
I'd be open to the idea of a variation on the for loop that does introduce a new scope for the loop variable. It would need new syntax, though.
Yes, that's what we talked about with the "for let" (struck through) and "for const" ideas.
Before responding, I decided to try to eat my own dogfood and prototype block-level scoping for "for". The good news is that I was able to hack up something. It still needs cleanup before posting, though. (As a sneak preview, it's *of course* not written in C.)
I didn't suggest it initially because proposals for subscopes within functions typically meet with a lot of resistance, and I didn't want to get sidetracked arguing about that when it isn't strictly needed.
And I'm here because seemingly unrelated discussions quite visibly converge on the (fundamental) topics of (finer-grained) scoping and const'ness. E.g. the previous one was just a couple of weeks ago, in relation to pattern matching: https://firstname.lastname@example.org/message/MUCUIUF5... (I kinda dropped the ball on answering it, but now hope to).
Here again they both come up in relation to the issues with "for". Maybe it's time to make the following observations:
1. Python tried hard to avoid block-level scoping altogether.
2. It failed; the cases of comprehensions and the "except" block are the witnesses.
3. Those cases were implemented in an ad hoc way (in one, a whole new function scope, and a call to it, is introduced; in the other, there's a flat-world emulation that just del's the variable, without caring that this kills a same-named variable in the enclosing scope).
4. On the other hand, block-level scoping is a well-known, well-explored, and popular scoping discipline in other programming languages, with a solid formal model behind it. (That model being that each block is, well, a function in lambda calculus. But that's the conceptual model; the implementation can be more efficient.)
5. Maybe it's time to embrace proper block-level scoping as a means to address Python's semantic issues (up to enabling its explicit and generic usage)?
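Both ad hoc implementations from point 3 can be observed in current CPython. A small sketch (the variable names are mine) showing the comprehension's hidden function scope, and the "except" clause's implicit del killing a same-named variable in the enclosing scope:

```python
# Comprehensions: implemented as a hidden function scope, so the
# loop variable does not leak into (or clobber) the enclosing scope.
x = "outer"
squares = [x * x for x in range(3)]
assert x == "outer"          # the comprehension's x lived in its own scope
assert squares == [0, 1, 4]

# "except" blocks: block scoping is emulated by an implicit "del" of
# the bound name at block exit -- which also kills any pre-existing
# same-named variable in the enclosing scope.
e = "outer"
try:
    raise ValueError("boom")
except ValueError as e:
    pass
# The implicit "del e" removed the name entirely, "outer" and all:
try:
    e
    leaked = True
except NameError:
    leaked = False
assert leaked is False
```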
A million cells for a million iterations?
If your million iterations create a million nested functions that capture the variable and are all kept alive, then yes, you will get a million cells along with your million function objects.
If the loop doesn't have any nested functions, or they don't capture the loop variable, there will be 0 cells.
Those are good answers for "coding time". (I can even continue: "... And if you create inner functions in a loop, you probably don't run it a million times. And if you do, there's already plenty of overhead due to the allocation of such inner function objects anyway.")
But what I argue for is to not rush into coding yet another ad hoc workaround, like the ones already made for comprehensions/excepts, but to spend more time in the "design" phase, trying to find a fundamental solution which could cover many cases.
Note that the cells would still be needed even if there were a nested scope.
I'm not sure if there was a better word to use somewhere in that phrase.
Anyway, cells are needed *only* because the default semantics allows mutability of variables (and that semantics makes good sense for an imperative language). But if we know that some variable is immutable, we don't need a cell; we can capture the *value* of the variable in the closure.
That's what "for const var in ..." (and "const var = ...") syntax expresses.
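For comparison, today's idiom for capture-by-value is the default-argument trick: the default is evaluated at function-definition time, so each closure gets the value current at that iteration. A sketch (helper name is mine) of what "for const" would let you write without the trick:

```python
def make_funcs():
    funcs = []
    for i in range(3):
        # The default i=i is evaluated now, capturing i's current *value*,
        # rather than a shared cell for the variable i.
        funcs.append(lambda i=i: i)
    return funcs

# Each closure keeps the value from its own iteration:
assert [f() for f in make_funcs()] == [0, 1, 2]
```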
I'd suggest that it should be "for let"
That makes no sense as a phrase in English.
Was covered in other replies.
the scoping should be optimized to always treat "x" (as the "for" control variable) by-value,
That would make it very different from the way closures work anywhere else in the language.
But that's exactly the main point of what I'm talking about. We should not go for random syntactic/semantic patches here and there. Instead, we should consider which fundamental notions are missing in Python semantics, and those are:
1. Block-level scoping.
2. Immutability.
- and then see how to apply them in a general way to the language as a whole (in the sense of "allowing them to be used 'everywhere'", not "forcing their usage on everyone").
In this regard:
---
def foo():
    for const x in seq:
        f = lambda: print(x)
---
---
def foo():
    const x = 1
    f = lambda: print(x)
---
- will both capture "x" by value. In other words, "const" will consistently work everywhere in the language.
for const x in some_iterator: ...
That will make it obvious that "x" is captured by value,
for const x in some_iterator:
    ...
    x = 1
Doesn't make sense (should lead to error - assigning to a const).
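For reference, the closest existing approximation in Python today is `typing.Final`: it marks a name as not-to-be-rebound, though only static checkers such as mypy enforce it; the runtime does not. A minimal sketch:

```python
from typing import Final

# Final is annotation-only: a static checker flags any rebinding of
# LIMIT, but at runtime it behaves like an ordinary variable.
LIMIT: Final = 10

def scaled(x: int) -> int:
    return x * LIMIT

assert scaled(3) == 30
```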
This is mixing up two different issues -- the scope of the loop variable, and whether it can be assigned to.
There was no mix-up on my side, and it doesn't seem there was one on yours. Block-level scoping and const'ness are orthogonal, well-composable features. In the case of "for", they play well together.
I proposed "new" because it directly gets at the important thing, which is that x is a *different* x each time round the loop.
So, there shouldn't be "special syntax". There should be generic syntax to solve Python's block scoping and const'ness issues.
Which people would then have to remember to use. By "special" in that context I mean "something you have to do differently to get your for loop to work the way you expect". Not necessarily something that can only be used with for loops.
The way to keep the features easy for people to remember is to make them generic, fundamental, reusable, and composable. "for const" on its own doesn't make much sense (and neither does "for new", i.e. an ad hoc workaround for the "for" case). Implementation-wise, "for const" might be a pilot, but the general vision should be the availability of "const" (and eventually, block scoping ;-)) in the language in general.