After following this thread I'm still not convinced changing the for
loop is really a good idea. I basically agree with Steven D'Aprano that
the supposed benefits of "protecting" against accidental name shadowing
are marginal at best and don't justify complicating matters.
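For context, the for-loop surprise the thread is circling around is the usual late-binding one: closures created in a loop all see the loop variable's final value, and the standard workaround is exactly the default-argument capture idiom discussed below. A minimal illustration:

```python
# Late binding: every lambda closes over the *variable* i, not its value.
fs = [lambda: i for i in range(3)]
late = [f() for f in fs]  # [2, 2, 2] -- all see i's final value

# Definition-time capture via a default argument: each lambda
# snapshots the current value of i when it is defined.
gs = [lambda i=i: i for i in range(3)]
early = [g() for g in gs]  # [0, 1, 2]
```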
def f(a, b, c) with x, y:
# x and y here have whatever values they had in the enclosing scope at definition time
I don't *entirely* hate this. But it still feels like new syntax for something that is perfectly good as pure convention:
def f(a, b, c=21, _foo=foo, _bar=bar):
# "capture" foo and bar from enclosing scope
foo, bar = _foo, _bar # optionally
Of course I know that users COULD call `f(1, 2, _foo=something_else)`. But we have politely asked them not to, and "we're all consenting adults."
What feels like a VASTLY more important issue is that foo and bar might have mutable types. Quite possibly, the underlying objects will have changed by the time the function is called. That can be good or bad, but it's definitely something folks stumble over.
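Concretely: default values are evaluated once, at definition time, but for a mutable object that only captures a *reference*. In-place mutation after the def is still visible inside the function. A small sketch (the names `foo` and `f` are just illustrative):

```python
foo = [1, 2]

def f(_foo=foo):
    # _foo was bound at definition time, but it aliases the same list
    return _foo

foo.append(3)
print(f())  # [1, 2, 3] -- the "captured" list changed underneath us
```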
On the other hand, I know that pitfall perfectly well, and the number of times I've actually needed to write this is small:
from copy import deepcopy
def g(a, b, _foo=deepcopy(foo)):
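That deepcopy variant really does snapshot the value rather than the reference, at the cost of a copy per definition. A sketch with an illustrative dict:

```python
from copy import deepcopy

foo = {"n": 1}

def g(_foo=deepcopy(foo)):
    # _foo is a definition-time snapshot, independent of foo
    return _foo

foo["n"] = 99
print(g())  # {'n': 1} -- later mutation of foo is not seen
```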
The dead increasingly dominate and strangle both the living and the
not-yet born. Vampiric capital and undead corporate persons abuse
the lives and control the thoughts of homo faber. Ideas, once born,
become abortifacients against new conceptions.