[RFC] "Strict execution mode" (TL;DR version)

Hello,

Recently, there was a discussion of a language-level (vs just stdlib-level, as currently done by the "typing" module) "const" annotation. The discussion mentioned that the initial implementation would have best-effort semantics, i.e. it would not guarantee "full" constant semantics, effectively being, in its baseline, the same as any other annotation. The question was immediately raised of what a full, eventual implementation would look like, and the posted proposal showcases *one* of the options. This initial post provides the "TL;DR" version, also known as the "abstract". The following post has the full proposal narrative.

-----

## Abstract

This proposal seeks to define "constant namespace slots" (variables/functions/methods) for Python, including both implicit and explicit (user-annotated) slots. Further, the proposal seeks to define the necessary changes to language semantics to actually uphold the constant property (i.e. the inability to redefine). To preserve as much as possible of the benefits of Python's dynamic nature and the metaprogramming it enables, the proposal introduces 2 distinct execution modes: "import-time", which preserves full existing Python execution semantics (albeit with some warnings to point at "problem spots"), and "run-time", where the constant property of the designated name slots (including implicitly constant name slots, like functions/methods) is enforced (attempts to assign to such slots lead to a runtime exception). Overall, the new execution variant is named "strict mode".

It is believed that these measures would allow developing more structured and more robust Python code, especially in big codebases. At the same time, the underlying motivation for the strict mode is to ease the development of runtime optimizers for Python, in particular JIT compilers.

Support infrastructure for the strict mode is defined: means to enable it, and compatibility measures with the standard execution mode. The strict mode is intended to be an opt-in feature, not a replacement for the default Python execution mode.

Below is an example of a strict mode program with comments explaining various features:

```
import mod

# Leads to a warning: replacing (monkey-patching) a constant slot (function) with a variable.
mod.func1 = 1

# Leads to a warning: replacing (monkey-patching) a constant slot (function).
mod.func2 = lambda: None

# Way to define a constant.
my_cnst: const = 1

# Leads to a warning: replacing (monkey-patching) a constant slot.
my_cnst: const = 2

glb1 = 100

def fun():
    # Imports are not allowed at run-time
    import mod2
    # But you can re-import module previously imported at import-time.
    import mod

    # RuntimeError
    my_cnst = 3
    # RuntimeError
    mod.func2 = lambda x: 1

    global glb1, new
    # RuntimeError: Cannot create new global nameslots at runtime.
    new = 1
    # Nor can delete existing
    del glb1

    # Cheats don't work
    globals()["new"] = 1

# Leads to a warning: replacing (monkey-patching) a constant slot (function).
def fun():
    pass

# fun_var is a variable storing a reference to a function (can store ref
# to another func).
fun_var = fun

# fun2 is an alias of fun
fun2: const = fun

# Run-time execution starts with this function. This clearly delineates
# import-time from run-time: a module top-level code is executed at
# import-time (including import statements, which execute top-level code
# of other modules recursively). When that is complete, strict mode
# interpreter switches to run-time mode (restrictions enabled) and
# executes __main__().
def __main__():
    fun()

# This statement is not executed when the program runs in strict mode.
# It is executed when it is run in normal mode, and allows having
# the same startup sequence (execution of __main__()) for both cases.
if __name__ == "__main__":
    __main__()
```

-- Best regards, Paul mailto:pmiscml@gmail.com

Hello,

I see that code listing was partially garbled (code merged into some comments). It shouldn't be too bad to disambiguate it, but let me try to repost the code again:

```
import mod

# Leads to a warning: replacing (monkey-patching) a constant slot (function) with a variable.
mod.func1 = 1

# Leads to a warning: replacing (monkey-patching) a constant slot (function).
mod.func2 = lambda: None

# Way to define a constant.
my_cnst: const = 1

# Leads to a warning: replacing (monkey-patching) a constant slot.
my_cnst: const = 2

glb1 = 100

def fun():
    # Imports are not allowed at run-time
    import mod2
    # But you can re-import module previously imported at import-time.
    import mod

    # RuntimeError
    my_cnst = 3
    # RuntimeError
    mod.func2 = lambda x: 1

    global glb1, new
    # RuntimeError: Cannot create new global nameslots at runtime.
    new = 1
    # Nor can delete existing
    del glb1

    # Cheats don't work
    globals()["new"] = 1

# Leads to a warning: replacing (monkey-patching) a constant slot (function).
def fun():
    pass

# fun_var is a variable storing a reference to a function (can store ref
# to another func).
fun_var = fun

# fun2 is an alias of fun
fun2: const = fun

# Run-time execution starts with this function. This clearly delineates
# import-time from run-time: a module top-level code is executed at
# import-time (including import statements, which execute top-level code
# of other modules recursively). When that is complete, strict mode
# interpreter switches to run-time mode (restrictions enabled) and
# executes __main__().
def __main__():
    fun()

# This statement is not executed when the program runs in strict mode.
# It is executed when it is run in normal mode, and allows having
# the same startup sequence (execution of __main__()) for both cases.
if __name__ == "__main__":
    __main__()
```

On Tue, 1 Dec 2020 18:26:48 +0300 Paul Sokolovsky <pmiscml@gmail.com> wrote:
[] -- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 2:29 AM Paul Sokolovsky <pmiscml@gmail.com> wrote:
Wait, what? No. No no no. Please do not do ANYTHING like this. Having suffered under JavaScript's highly restrictive import system (and actually been glad for it, since the alternative is far worse), I do not want ANY version of Python to give up the full power of its import system, including the ability for a module to be imported only when it's actually needed. Imports inside functions allow a program to have optional dependencies, or dependencies that might be slow to load (eg numpy), and without that, even running your script with "--help" has to process every single import in the entire file. -1000. ChrisA

Hello, On Wed, 2 Dec 2020 02:39:38 +1100 Chris Angelico <rosuav@gmail.com> wrote:
Then it's lucky that ALL versions and dialects of Python aren't under your control ;-).
But didn't you already spot the line which says that the strict mode also aspires to improve on Python module practices? Under strict mode's firm but benevolent rule, there won't be slow-loading modules any more. All imports will be fast. And modules which want to be slow will do that in their module.init() function.
-1000.
I also forgot to mention a very important point in the intro: when you read this proposal, please don't think about "CPython". That for sure will send you the wrong vibes. Think about "Python". ;-)
ChrisA
-- Best regards, Paul mailto:pmiscml@gmail.com

On Tue, 1 Dec 2020 at 15:55, Paul Sokolovsky <pmiscml@gmail.com> wrote:
By which you mean "CPython and all other implementations" I assume? I'm also -1000 on this proposal, even if you just limit it to CPython. It's not at all clear what prompted this idea, but if you're suggesting "modify Python so we can make it faster" then I'd rather see a prototype implementation that demonstrated both the limitations *and* the improved performance. We shouldn't limit the core language based simply on speculative benefits that some implementation might be able to achieve. Paul

There exist TWO highly successful, widely used JIT compilers for Python: PyPy and Numba. Neither one of them would have any use whatsoever for this constantness. Or if you believe otherwise, get a developer of one of those to comment so. JIT'd Python simply is not slow, even compared to compiled languages. Searching for maybe-possibly-someday optimizations while ignoring the actual speed paths is silly. But I'll be moderate and only vote -100, 10x less negative than Paul Moore and Chris Angelico :-). On Tue, Dec 1, 2020 at 4:03 PM Paul Moore <p.f.moore@gmail.com> wrote:
-- The dead increasingly dominate and strangle both the living and the not-yet born. Vampiric capital and undead corporate persons abuse the lives and control the thoughts of homo faber. Ideas, once born, become abortifacients against new conceptions.

I think that what you want is another language that already exists, and it's RPython: https://rpython.readthedocs.io/en/latest/rpython.html See the constants paragraph. RPython is used to create PyPy, not to limit normal Python programming :-)

Hello, On Tue, 1 Dec 2020 21:05:23 +0100 Marco Sulla <Marco.Sulla.Python@gmail.com> wrote:
I embarked on reading the RPython docs several times, and always found myself stopping right at the top of that page, at the phrase "The exact definition is “RPython is everything that our translation toolchain can accept” :)". You know, I can very much relate to that phrase. As a little guy, that's how I write my stuff - no proper docs, no nothing, and when somebody comes by asking what's supported or similar, the response is "RTFC!" (where "C" is "code"). But sorry, I just can't take that from a project which for a long time received non-trivial funding and which is supposed to serve as a base for other projects. You can compare that with my "strict mode" proposal, where I try to spell out how the flaming thing works, even though the idea is banally simple. That said, those are my experiences with RPython. What are yours? What have you written with it? I'm especially interested in memory requirements. The "strict mode" is implemented in Pycopy, which can do more or less useful things in 16KB of heap, and the strict mode doesn't regress that much. How does RPython feel in 16KB? [] -- Best regards, Paul mailto:pmiscml@gmail.com

Hello, On Tue, 1 Dec 2020 19:09:28 +0000 David Mertz <mertz@gnosis.cx> wrote:
Of course not. What PyPy needed were European Commission grants, nothing else. Numba needed a lot of corporate backing for sure too, but they also needed LLVM bindings. So they picked some guy's module. Then they threw it away, because it was hard to maintain, and made their own. While LLVM has always shipped Python bindings. That's absolutely normal, but it shows there's a lot of "random betting" happening even in "highly successful" projects. But most importantly, I'm not interested in survivorship bias. I'm not much interested in knowing how to write million-dollar JIT projects, because heck, that's been known for decades. I'm interested in knowing how NOT to write "million-dollar" JIT projects, and why Unladen Swallow and Pyston failed. I'm also interested in knowing how to write JIT projects which do NOT cost millions of dollars. All that is quite an unusual hobby, you bet.
Yeah! It just doesn't work for the stuff you need in the way you need it, and it's too bloated, so when you want to fix it, you can't.
Exactly - interested in low-hanging optimizations. Much less interested in approaches like "we'll feed in some random crap, and LLVM will take care of it". So, hopefully, the motivation is clear - I'm doing this stuff because it's such an obvious thing to do, and the guys who got the same idea in 2001 or so didn't seem to have left tangible artifacts beyond deadlocked PEPs.
But I'll be moderate and only vote -100, 10x less negative than Paul Moore and Chris Angelico :-).
[] -- Best regards, Paul mailto:pmiscml@gmail.com

Hello, On Tue, 1 Dec 2020 16:02:21 +0000 Paul Moore <p.f.moore@gmail.com> wrote:
No, I mean exactly what's written in the proposal: "This proposal seeks to introduce opt-in "strict execution mode" for Python language implementations interested in such a feature."
I'm also -1000 on this proposal, even if you just limit it to CPython.
It's not at all clear what prompted this idea,
The proposal is 32+K (whoa, did *I* scribble all that?!), so I suspect somewhere in there it makes it clear(er).
Yeah, I'd like to see that too! I just calculated that, betting on myself, it may take years, or maybe I'd give up or switch over to something else, like most other people do. So, I decided to throw over the fence what I have now - the idea, a more or less detailed "spec" for it, and even an implementation in a niche Python dialect, but exactly the kind of one which may benefit from it. Just imagine: if someone had previously written such a detailed spec, and I liked it - I might be implementing it now. And if they had actually even provided a sample implementation, I might now be coding changes for it in the compiler, and maybe even running a few tests to provide those performance figures which you and I would so much like! Getting thoughts like that, most people I know would reflect them back on themselves, and I'm not an exception. So, I post whatever I have for peer review, and continue.
That's why it's introduced as an *opt-in* feature for *interested* implementations.
Paul
-- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 6:11 AM Paul Sokolovsky <pmiscml@gmail.com> wrote:
Maybe flip this around a bit -- it seems that RPython, from the PyPy project, is an implementation of a "strict" version of Python. You mentioned that it was frustrating that it does not have a properly written (or any?) spec. But it DOES have an implementation, so maybe you can write the spec instead of the implementation :-)

Honestly, I myself have MANY times decided to write my own thing, rather than take the time to figure out what someone else already wrote -- even though it usually takes longer and gets a worse result :-) I have not followed PyPy closely, but I suspect that the lack of a formal spec for RPython is not laziness, but rather that the entire point of it is to support PyPy, so they want to be free to adjust it as needed, rather than sticking to a spec (or constantly trying to maintain a spec). Nevertheless, there are probably some really good lessons in there, and it would be very interesting to see if (a version of) RPython could reasonably be used to directly write general purpose programs. Other than the fact that a lot of work has already been done on it, RPython has the advantage that (presumably) its restrictions are there because they have been shown to help performance.

Another option would be to build on something like Cython -- taking advantage of the type specifications at run time, without pre-compiling the entire module. NOTE on that: back when there was a lot of discussion about standardizing type hints, I asked about making them work with, e.g. Cython, to get performance benefits. The answer at that time was that performance was NOT the point of type hinting -- i.e. it was designed explicitly to support Python's dynamic nature. So it seems adding things like you are proposing with an eye to performance is not really where the Python community wants to go. -CHB
-- Christopher Barker, PhD Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython

Hello, On Wed, 2 Dec 2020 09:53:07 -0800 Christopher Barker <pythonchb@gmail.com> wrote:
No, it's "restricted" version of Python, that's what "R" in RPython means. "Flip around" is very related however. In a sense, RPython and "strict mode" approaches are opposite: RPython starts with a simple subset of Python (and let you write a full Python in it), while "strict mode" takes a full Python and tries to take away from it as little as possible to achieve the desired effect (optimizing name lookups in this case). I hope you agree the difference in approaches is quite noticeable.
It definitely has some docs. Yes, I would say they're not ideal.
but it DOES have an implementation, so maybe you can write the spec instead of the implementation :-)
For that I would need to use RPython. I considered that circa 5 years ago, and of course explored it. I "liked" what I saw, sure. But I "wasn't happy" with what I saw. There seem to be hints that I may be making this "strict mode" thingy because I wasn't aware of PyPy/RPython. But actually it's the opposite: being aware of RPython, and knowing enough about it, is what caused me to explore a different direction.
I've been an open-source guy all along, and consider duplication of effort to be one of the "mortal sins" of open source. Beyond that, I'm also a lazy dude who prefers to spend his days on the beach and not in front of a computer. Nor was I ever mad enough to "write my own thing". I wanted to write a small Python literally for decades, but instead studied what other people did/do. I for example rejected TinyPy as a base, due to its source code quality (but its spirit is very high; I consider my Pycopy to be a spiritual successor of it). Then when MicroPython was announced on Kickstarter, I literally talked its author into open-sourcing it half a year earlier, and contributed to it for several years, literally having written a third of it. It's the same with the "strict mode". While there's a bunch of "original research" in it, it's all based on (or related to) ideas expressed by other people, which I made sure to study before proceeding to the implementation (references are given in the proposal).
I fully agree, and fully understand that. Again, that's a reason to explore a different direction, not against it. ("That area is already covered by good people, let's look elsewhere.")
It's not, and that's disclosed right away. If anything, the lack of "too beautiful docs" is related to the PyPy project's desire not to make RPython a standalone, general-purpose dialect of Python. Nor is it comparable to "normal" Python; it's usually presented as a "C-like subset" of Python. (I'm writing this from memory and don't have references at hand, so the specific terminology used by them may be different).
I remember that very well, and that's another point I don't agree with.
I never met a Python user who said something like "I want Python to be slow" or "I want Python to keep being slow", so we'll see how that goes.
-CHB
[] -- Best regards, Paul mailto:pmiscml@gmail.com

just one more note:
But many might say "I don't want to make Python less flexible in order to gain performance". Of course no one is going to reject an enhancement that improves performance if it has no costs. My thought on your idea is this: yes, a more restricted (strict) version of Python that had substantially better performance could be very nice. But the trick here is that you are proposing a spec, hoping that it could be used to enhance performance. I suspect you aren't going to get very far (with community support) without an implementation that shows what the performance benefits really are. I'm just one random guy on this list, but my response is: "interesting, but show me how it works before you make anything official" -CHB ----- Christopher Barker, PhD Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython

On Sat, Dec 5, 2020, 3:03 PM Christopher Barker <pythonchb@gmail.com> wrote:
From another who, like CHB, is just a random person on this list (but probably even more "random"), interested enough to have read the entire thread and the other thread, but not knowledgeable or competent enough to offer detailed comments that are going to be particularly helpful to anyone, I'd say this:

If you could actually make a fully functioning Python that is significantly faster by doing this, and it introduced this two-stage interpreter idea with a much more strict secondary stage, and did not require all kinds of additional syntax in the code to get the speed improvements (like Cython for example), I'd think you really might have something that would help a lot of people see the benefits of a potential switch to a stricter paradigm of writing in an ostensibly dynamic language that nonetheless would now have to be written much less dynamically inside functions and class methods. It seems to me that if the speed increase is enough, it could be worth the decrease in flexibility, potentially. At least enough to support the existence of a second mode of Python execution (whether that mode lives in CPython or not doesn't seem to much matter to me).

However I think maybe a big problem is going to be the lack of interest from very popular third party, and even standard, libraries in rewriting their code to fit a D1S2 (dynamic stage one, strict stage two) interpretation model. It seems likely many heavily used packages will simply be near totally broken for your strict interpreter, and many many others will need tweaking. So they will have to be rewritten, or at least tweaked, somehow. Maybe many could be rewritten automatically? I do not know. But I think you need to consider that you could get to the end of writing this thing and have it working perfectly with a major (10x? 50x?) speed improvement, and still have trouble getting people interested because you can't run code in, like, the enum or pathlib or functools library, or requests or numpy or something else. That would be a bummer. How do you see that problem getting solved?

Hello, On Sun, 6 Dec 2020 00:37:15 -0500 Ricky Teachey <ricky@teachey.org> wrote: []
That's exactly the interesting part, which would be interesting to discuss, with interested parties. Just to give an idea of my timeline: I coded the basic strict mode implementation for Pycopy in August last year. Then made another pass over it in November last year. Before merging it to the Pycopy mainline, I wanted to make sure it's viable with "general Python code". That's why over the winter holidays 2019/2020 I coded up a CPython pure-Python impl, https://github.com/pfalcon/python-strict-mode. Of course, I faced issues with CPython, went on to argue with CPython developers that they should fix their stuff, and then suddenly the winter holidays were over. Fast forward to this November, I figured I'm not making progress. So I think "god cares about CPython software, I care about *my* software". I went to convert whatever code I already had running in Pycopy to the strict mode and found it's not bad at all (fixed a gazillion usability bugs in the strict mode, yeah).

The spec - I started to write it because, after such a delay, my first reaction was literally the same as @rosuav's in his reply here on the list: "Wait, wtf, we don't support dynamic module imports? I lllluuuuve dynamic module imports." So, I had to remind myself why, and write it down this time. That's when open-source projects get documentation - when the authors themselves find a need for it ;-).

Bottom line, here's the biggest change I had to apply to my most mind-boggling dynamic-imports app: https://github.com/pfalcon/ScratchABlock/commit/ac2a9145ec8c05fe2be7c982d88a...

The app allows passing a dir name on the command line, which can be full of files, and then inside each file, there can be multiple module names to import. Whoa! Still, 25 lines to cover it. To see whether that's much or not, one would need to compare with what it would take me to do that in a static language. So, above I'm using Python as a kind of DSL for my app. In a static language, I would need to write a *real* DSL: all the lexer/parser/interpreter business. Not 25 lines at all. And in Python, I can pay a 25-line price to get rid of the most obnoxious Python misfeature compared to a static language: blatantly inefficient namespace lookups. Again, I'd be only more interested to hear/see/tell more stories about that. Just need to start somewhere.
I don't see much of a problem at all. I see it the same way as e.g. the Cython or Mypyc authors do: "to use this stuff, you need to change your Python code". So, what we need to compare is how much you need to change and what you get in return. The strict mode asks for rather modest changes compared to the tools above. But neither does it claim a 10x-50x speed improvement. Actually, the idea behind the strict mode is not to make Python faster. It's to make Python *not slower*. In one particular area - name lookup (and then only static classes/functions mostly, but stay tuned for strict mode part 2, where we brainstorm object method lookup). -- Best regards, Paul mailto:pmiscml@gmail.com

Hello, On Sat, 5 Dec 2020 12:02:52 -0800 Christopher Barker <pythonchb@gmail.com> wrote:
And I'd shake hands with them, because I add "strict mode" as an additional, optional mode beyond the standard Python mode. (I'd however expect to personally use it often, because it's just a small notch above how I write programs in Python anyway.)
As I mentioned in previous replies, I fully agree that it would be nice to see performance figures. But sadly, as directly related to the strict mode, those aren't available yet. However, if the question is to explicate the idea further, that can be done on synthetic examples right away. Suppose we have pretty typically-looking Python code like (condensed to save on vertical space):

---
def foo():
    a = 1; b = 2
    for _ in range(10000000):
        c = min(a, b)
foo()
---

The problem with executing that code is that "min", per the standard Python semantics, is looked up by name (and beyond that, the lookup is two-level, aka "pretty complex"). Done 10 mln times in a loop, that's gotta be slow. Let's run it in a Python implementation which doesn't have existing means to optimize such "pretty complex" lookups, e.g. my Pycopy (btw, am I the only one who finds it weird that you can't pass a script to timeit?):

$ pycopy -m timeit -n1 -r1 "import case1"
1 loops, best of 1: 2.41 sec per loop

A common way to optimize global lookups (which are usually by name in overdynamic languages) is to cache the looked-up value in a local variable (locals aren't part of the external interface, and thus are usually already optimized to be accessed by "stack slot"):

---
def foo():
    from builtins import min
    a = 1; b = 2
    for _ in range(10000000):
        c = min(a, b)
foo()
---

$ pycopy -m timeit -n1 -r1 "import case3"
1 loops, best of 1: 551 msec per loop

4 times faster. So, the idea behind the strict mode is to be able to perform such an optimization automatically, without manual patching like "from builtins import min" above. And the example above shows just the surface of it, for bytecode interpretation cases. But the strict mode reaches straight to the JITted machine code, where it allows generating the same code for function calls as would be generated for C. "Code for function calls" is the keyword here. Of course, Python differs from C in more things than just name lookups. And most of these things are necessarily slower (and much harder to optimize). But the name lookups don't have to be, and the strict mode (so far) tries to improve just this one aspect. And it does that because it's simple to do, for very modest losses in Python expressivity (adjusted for real-world code sanity and maintainability). And it again does that to put a checkmark against it and move on to the other things to optimize (or not).
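For illustration, the same local-caching effect can be observed at the bytecode level in CPython (a sketch using the standard `dis` module; the default-argument trick below is a stand-in for the in-function `from builtins import min`, with the same effect of turning the name lookup into an indexed local-slot access):

```
import dis

def lookup_by_name(a, b):
    # `min` is resolved by a dictionary lookup on each call (LOAD_GLOBAL).
    return min(a, b)

def lookup_by_slot(a, b, min=min):
    # `min` is a local here, accessed by index (LOAD_FAST).
    return min(a, b)

dis.dis(lookup_by_name)   # shows LOAD_GLOBAL for `min`
dis.dis(lookup_by_slot)   # shows LOAD_FAST for `min`
```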
It's nothing "official", it's completely grass-roots proposal for whoever may be interested in it. But I have to admit that I like it very much (after converting a few of my apps to it), and already treat it as unalienable part of the semantics of my Python dialect, Pycopy.
-CHB
[] -- Best regards, Paul mailto:pmiscml@gmail.com

On 2020-12-05 05:49, Paul Sokolovsky wrote:
The main thing I don't understand is what you expect to be the result of this proposal; or, in other words, what you expect anyone besides you to do with your idea. You say we shouldn't think in terms of CPython, that this is separate from CPython. You say you're also not interested in existing alternative Pythons or quasi-Pythons like Cython. So what is it you want? Some people here to say "Yes, I'll join you and work on a totally new restricted version of Python from scratch just to see what happens"? -- Brendan Barnwell "Do not follow where the path may lead. Go, instead, where there is no path, and leave a trail." --author unknown

Hello, On Sat, 05 Dec 2020 21:48:53 -0800 Brendan Barnwell <brenbarn@brenbarn.net> wrote:
Roughly, I expect it to be about the same as for e.g. https://www.python.org/dev/peps/pep-0509/ . The background motivation of the two is the same: look for ways to optimize name lookups. Mine however reaches "above the surface" and requires some restructuring on the side of Python programs, so it potentially "affects" (as in: can elicit a response from) a wider audience than PEP 509, which is clearly targeted at Python internals enthusiasts.

Generally, like any human being, I'm interested in communicating, and potentially even cooperating, with other like-minded human beings. I don't pledge for the latter. I will understand even if nobody is truly interested in my idea. But it's my strong belief that there should be more people interested in this stuff (not just my particular proposal, but stuff related to conceptual, "meta", and, on the other end of the spectrum, implementation matters in Python). So, I post for whoever may be lurking around, or who has wanted to look at this stuff for a long time, and bringing it up may be the "last straw" for them to actually dig into it. Receiving criticism from not-really-interested people is also helpful, but such a discussion quickly derails, as experience shows.

So, coming to specifics, some points which *could* be discussed:

1. The proposal claims that some of the restrictions imposed are already oftentimes imposed by codebases caring about their hygiene. It would be helpful to get (detailed enough, not low-effort) aye's or nay's.

2. For restrictions where the proposal goes beyond something which can be called "existing practices", how harsh are those, and what can be done about them? There's literally one (YMMV) "grave" thing in the proposal - the prohibition of runtime imports. It's also outlined how to address it.

So, if we get past Chris Angelico's "no no no no" for a curious walk, what do we see? As an example of a possible p.1 discussion point, the matter of global variable definition/declaration was brought up in another message. So, do you define your global variables at the module's top level? How do you feel about it? Is it:

a) Are you nuts? How else can it be done?
b) I do it.
c) I don't do it.
d) Defining the global variables at the global scope? Only over my dead body!

-- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 2:53 AM Paul Sokolovsky <pmiscml@gmail.com> wrote:
If there are special-purpose cut-down implementations that provide a more restricted environment in return for some other feature (say, "being able to run in a web browser", or "being able to run on a microcontroller"), then that's fine, but that's not a language feature - that's a partial implementation that sacrifices the parts it can't afford to do. From the sound of your proposal, this should be part of the main Python language and be a normal expectation of code.
Still -1000 on that, partly because you can never guarantee that it'll be fast (if only because searching for and loading large numbers of files can be very expensive without any single one of them being blamed for the cost), and partly because two-phase initialization is something Python tries hard to avoid. You don't get code like this:

    f = file("/path/to/file")
    f.open()

    db = psycopg2.database("postgresql://user@localhost/dbname")
    db.connect()

Two-phase initialization means that everything in that module (or class, as the case may be) has to first check if it's been initialized. Why not just initialize it when you create the object? Why not just initialize when you import the module? How much do you really gain by forcing this? You get stub modules everywhere that aren't initialized yet, but have to be imported just to satisfy this restriction. Current code simply... doesn't import it.
Yes, I'm thinking about Python, but unless you specifically tell me that this is a narrow special-purpose sublanguage, I'm going to assume you want CPython to at least respect this, even if it doesn't take advantage of it. ChrisA

Hello, On Wed, 2 Dec 2020 03:02:26 +1100 Chris Angelico <rosuav@gmail.com> wrote: []
Kinda yes, except not as vulgar as the examples above. More like "being able to develop a JIT which doesn't cost a million dollars" (extra points if that still can run on a microcontroller!).
The sacrifice is coming more from theoretical constraints on what's (easily) possible and what's not, not from vulgar constraints like "my microcontroller doesn't have space for that". How to be a dynamic language (addressing only dynamic name lookup here) and not suffer from that? How to have a cake and eat it? A simple answer (and we're interested in such here) is that you can't. First you have the cake - and heavy emphasis in the proposal is put on that, so you didn't start with just breadcrumbs. Then you eat the cake. To give the right idea of how to look at this stuff, you should compare this "strict mode" with Python compilers like Mypyc or Shedskin, and what restrictions they place.
From the sound of your proposal, this should be part of the main Python language and be a normal expectation of code.
I don't know where you got that sound from, because the very first sentence of the very first "Introduction" section reads: "This proposal seeks to introduce opt-in "strict execution mode" for Python language implementations interested in such a feature." So, is for example the CPython implementation interested? []
This proposal puts a heavy emphasis on the JIT usecase, and JIT is known to have its startup costs. JIT is also known not to be a panacea for all usecases, which is why this proposal says that strict mode doesn't replace "normal" mode - some programs are just bound to be run in it.
and partly because two-phase initialization is something Python tries hard to avoid.
This proposal specifically introduces a two-phase startup sequence, which does affect imports. The price of magic. Now, all that is discussed in detail in the proposal: https://github.com/pycopy/PycoEPs/blob/master/StrictMode.md#dynamic-module-i... (yes, I put it in a repo to link to specific sections). Sneak peeks for you:

"By far, the most "grave" issue with the strict mode is that dynamic (at run-time) imports are not supported"

"Let us walk thru why"

"To perform any non-trivial optimization on dynamic languages, whole-program analysis is required."

"Don't get me wrong - I absolutely love dynamic imports! As an example, the strict mode was implemented in the Pycopy dialect of Python, and of 5 not completely trivial applications written for Pycopy, 5 use dynamic imports."

"There is actually yet another option to tackle the dynamic imports problem - to ease restrictions"

So, I feel, and share, your pain ;-). []
I'd like to have a pure-Python implementation running on CPython, yes. I'm not sure what you mean by "respect this". Any valid strict-mode program is also a valid normal-mode program. You can treat strict mode as a kind of type linter - it will pinpoint issues violating a particular discipline (not a totally insane one), you fix them, and then you can run without it too.
ChrisA
-- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 9:10 AM Paul Sokolovsky <pmiscml@gmail.com> wrote:
That's completely the opposite of what I was talking about, then. If the goal is to develop a JIT compiler, then it should surely be part of the main implementations of the language - the ones that are broadly compatible with each other. Cut-down Python implementations are NOT fully compliant with the language spec, and that's okay, because nobody expects them to be. If the restricted execution model is incompatible with most Python scripts, why would anyone bother to use it? Does it actually offer performance advantages better than PyPy, which JIT-compiles and is able to run normal Python scripts? ChrisA

Hello, On Wed, 2 Dec 2020 09:16:56 +1100 Chris Angelico <rosuav@gmail.com> wrote:
E.g. because someone who wants to experiment with JIT would need to apply similar restrictions anyway.
As I mentioned in another response, it's all a contrarian approach, right from the start. It's not whether PyPy offers performance advantages, it's whether it runs in the target environments I consider interesting at all. The answer is NO. I then consider what it does to NOT run in the environments I find interesting. It runs unmodified scripts, and throws unlimited amounts of memory at that. So, I see how to modify scripts in such a way as to NOT throw unneeded memory at trivial things, while only improving the code hygiene. I get that it's hard to get ;-). Besides that, it also implements runtime support for "const" variables, which is closer to matters at the CPython level. (E.g., if there's support for constants, CPython's pattern matching doesn't need to go for ugly workarounds of forcing the use of "case Somewhere.SOMETHING"; it can be just "case SOMETHING:").
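(For readers unfamiliar with the workaround being referred to: under PEP 634 as implemented in CPython 3.10+, a bare name in a `case` is a capture pattern, so constants must be spelled as dotted names. A small illustration, with hypothetical names of my own:)

```
import enum

class Color(enum.Enum):
    RED = 1
    GREEN = 2

def describe(c):
    match c:
        case Color.RED:        # dotted name: a value pattern, compared by equality
            return "red"
        case anything_else:    # bare name: a capture pattern, matches everything
            return f"not red: {anything_else}"

print(describe(Color.RED))     # red
print(describe(Color.GREEN))   # not red: Color.GREEN
```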
ChrisA
-- Best regards, Paul mailto:pmiscml@gmail.com

Hello, On Wed, 2 Dec 2020 00:10:41 +0100 Marco Sulla <Marco.Sulla.Python@gmail.com> wrote:
That's my aspiration for the strict mode, yes. (I provide "full disclosure" of that fact.) Beyond that, strict mode is, well, strict. So, it may be interesting to people who want more "strictness" in Python. For example, some time ago, "type annotations" were introduced, and quite many people (though not everyone of course) aspire to make their programs more strict using them. The "strict mode" proposed here is similar, but explores a different dimension of strictness.
Why can't this be done in a separate project, like PyPy or Pycopy?
Both (and many more around!) of those projects are Pythons. So, not only it can be, it should be done in as many projects as possible. (In which specific, is up to their maintainers.) [] -- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 6:20 PM Paul Sokolovsky <pmiscml@gmail.com> wrote:
Type annotations don't add any strictness. They just give information so that an external tool can choose to provide information to the coder. Python can't change its execution plans based on type annotations, because they have no actual runtime meaning; and the corollary of that is that type annotations are fairly safe. Annotating a function won't suddenly break it (assuming the annotation is syntactically valid). If constness can be used to actually change behaviour, then it really does have to restrict the language, and that's quite a different beast. ChrisA
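(A small illustration of Chris's point, in plain current Python - annotations are stored as metadata and nothing is enforced at run time:)

```
def double(x: int) -> int:
    return x * 2

# The annotations are just recorded on the function object.
print(double.__annotations__)   # {'x': <class 'int'>, 'return': <class 'int'>}

# Passing a "wrong" type is not an error at run time.
print(double("ab"))             # prints 'abab'
```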

Hello, On Wed, 2 Dec 2020 18:33:49 +1100 Chris Angelico <rosuav@gmail.com> wrote: []
Teh "strict mode" proposed here can be seen as such an external tool too. (It can be internal tool too.)
Python can't change its execution plans based on type
CPython can't, other Pythons can. Mypyc is a well-known Python which changes its execution plans based on type annotations. So, the culprit is the same: people continue to think Python == (current) CPython. We now seem to be at the point where most advanced people say "no, no, we understand the difference", but they continue to *think* that. They try to "size up" a change against their *CPython* experiences (slow-loading numpy, blah-blah), even though they have been warned that way won't lead them anywhere. (They'll make a big circle and arrive at the conclusion "I can't use that with CPython right away", but that was the (implied) content of the initial warning). So again, you should not compare the "strict mode" with CPython. You should compare it with Mypy, Mypyc, Shedskin, Cython, pyflakes (they nag you about things, and strict mode nags you too), etc. []

On Wed, Dec 2, 2020 at 7:11 PM Paul Sokolovsky <pmiscml@gmail.com> wrote:
"Mypyc is (mostly) not yet useful for general Python development." So, it's not a fully compliant Python implementation. Is there any fully compliant Python that changes its behaviour based on type annotations? If not, "Python == CPython" isn't the problem here. ChrisA

On Wed, Dec 2, 2020 at 7:24 PM Chris Angelico <rosuav@gmail.com> wrote:
Forgot the citation: https://github.com/python/mypy/blob/master/mypyc/README.md ChrisA

Hello, On Wed, 2 Dec 2020 19:24:08 +1100 Chris Angelico <rosuav@gmail.com> wrote:
No worries, I don't claim that the strict mode is suitable for production use already. It's literally the first time a "full spec" (still apparently subject to change) is posted and a reference implementation is provided, beyond mentioning the idea here and there.
I'm not sure if you already smell it in the air, but the idea of a "fully compliant Python" is getting old (simply because it was there all this time, and people have learned its drawbacks too). Various people are looking at how to restrict it (e.g. apply *and* enforce type annotations), or, put simply, look for "the good parts". I can't believe I use that term towards Python, as it has a strong connotation of looking for gems in total crap (e.g. https://www.goodreads.com/book/show/2998152-javascript for readers who are not in the loop), but the recent decade showed that if applied consistently it can achieve even literally that effect. And just imagine what it can do if applied to a better-from-the-start language like Python! So, literally, more and more people are concentrating on the task of how to do better things with *good* Python programs, because how to do that with *any* Python program has been known for decades - just run it in a slow ugly interpreter, full of legacy [censored], commonly known as "CPython". The definition of "good" is still being sought, and will likely vary from faction to faction. For example, there're people who seriously think that a "good Python program" is one littered, up to a sizable part of its content, with ugly-looking type annotations of the current generation (already legacy, being pre-PEP 563). My proposal gives an alternative example of what a "good Python program" may be. YMMV
ChrisA
[] -- Best regards, Paul mailto:pmiscml@gmail.com

On Thu, Dec 3, 2020 at 8:31 AM Paul Sokolovsky <pmiscml@gmail.com> wrote:
[... etc ...] I'm not sure if anyone else feels like doing the work of assembling the various insults and dismissive, arrogant tone stuff to forward to the moderator. Is it still Brett and Titus? I'm certain it's more than enough to earn Paul a healthy time-out from the list. For my own purposes, I'm just going to killfile him on the list, and ignore threads of other people responding. Unfortunately, this tone is spread over several different subject lines, so just one won't mute it. Yours, David... -- The dead increasingly dominate and strangle both the living and the not-yet born. Vampiric capital and undead corporate persons abuse the lives and control the thoughts of homo faber. Ideas, once born, become abortifacients against new conceptions.

Hello, On Thu, 3 Dec 2020 16:25:09 +0000 David Mertz <mertz@gnosis.cx> wrote:
If there's a last word to have, I'd prefer to be banned by Brett. That's his write-up I take as inspiration for these ideas: https://snarky.ca/what-is-the-core-of-the-python-programming-language/ [] -- Best regards, Paul mailto:pmiscml@gmail.com

On Tue, Dec 01, 2020 at 06:53:35PM +0300, Paul Sokolovsky wrote:
I never think *only* of CPython when reading PEPs. Unless the PEP is clearly and obviously about an implementation detail of CPython, I always read them as proposing a *Python* language change. Doesn't everyone? But I don't understand your comment. Are you suggesting that CPython will be exempt from this proposal? Or that the proposal is aimed at helping alternative implementations? Should we be thinking specifically of some alternate implementation? If not, then the purpose of this "very important point" isn't clear to me. -- Steve

CPython Extension Proposals should be called CEPs -- Felipe V. Rodrigues https://felipevr.com On Tue, Dec 1, 2020 at 5:47 PM Steven D'Aprano <steve@pearwood.info> wrote:

Hello, On Wed, 2 Dec 2020 07:47:53 +1100 Steven D'Aprano <steve@pearwood.info> wrote:
I just think that some of the people who think in terms of "Python" == "CPython" would have a hard time imagining how this "strict mode" came by, if they try to apply it to the CPython situation. I also don't think that it applies well to CPython's approach. If anything, it's a kind of "contrarian" approach to how CPython tries to do things (e.g. https://www.python.org/dev/peps/pep-0509/).
Should we be thinking specifically of some alternate implementation?
I guess that thinking in terms of "tabula rasa" would help. Like, if you forget all the existing CPython baggage, and just face the problem of optimizing name lookups, what would you do? This proposal is one such exploration. Also Steven, it's literally the response I promised in our initial thread on the "const" annotation, https://mail.python.org/archives/list/python-ideas@python.org/message/SQTOWJ... As you can see, it's much more far-reaching than just how to implement runtime honoring of the "const" annotation. Because it tries to "extract" as much as possible of the already existing constness in Python programs, and then drafts how to use that to optimize lookups (en masse, as there's a lot of constness indeed). If we only needed to handle explicit "const" annotations, that would be a pretty small subset of this proposal.
If not, then the purpose of this "very important point" isn't clear to me.
[] -- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 02, 2020 at 12:24:00AM +0300, Paul Sokolovsky wrote:
Are you aware that CPython is the reference implementation of Python the language? If you are introducing language features and semantic changes that CPython doesn't support, or might not support, then it isn't Python, it's a superset (or subset) of the language. Like Cython. Either that, or you are hoping for a revolution where your proposed alternative language becomes so successful that a majority of users abandon CPython, PyPy, Nuitka, Stackless, Jython, IronPython etc and flock to your restricted language, taking the name "Python" with it. I don't think that's very likely.

So I don't think you should be thinking about scenarios where other implementations take this up and CPython doesn't. Rather, I think there are three scenarios:

1. Somebody invents a new language with your proposed semantics, similar to other Python-inspired languages such as Cython, Boo, Nim and Cobra. (At least one of those, Cython, is a superset of Python; the others are just copying syntax rather than semantics.)

2. CPython takes this up, as an optional extension that other implementations are free to ignore.

3. Or CPython takes this up, and makes it mandatory for any interpreter claiming to be Python to support it.

I don't think there is any chance of your proposal being blessed as official Python semantics without CPython supporting it.
This would be scenario number 1 above. A brand new language, inspired by Python, like Cobra. Or since it is backwards-compatible in non-strict mode, perhaps Cython is a better analogy. Or if you prefer, the Perl-inspired Raku. -- Steve

Paul wrote:
(Aside: the literal `1` is not a variable, and variables aren't first-class citizens in Python. We can't bind "a variable" to a name, only the variable's value.)

Is monkey-patching disallowed because `mod.func1` is defined as a constant? Or are all functions automatically considered to be constants? If the latter, then decorators won't work in strict mode:

    @decorate
    def func():
        ...

is defined as:

    # define-decorate-replace
    def func():
        ...
    func = decorate(func)

so if functions are automatically const, decorators won't work. Occasionally I find that decorator syntax is not sufficient, and I've used the explicit "define-decorate-replace" form. That won't work either.

Is it only *monkey-patching* from outside of the module that is disallowed, or any rebinding of functions, including within the owning module itself? If the latter, that's also going to break the common idiom:

    def func():
        # Slow Python version.
        ...

    # maybe replace with fast C version
    from c_accelerators import *

under strict mode.
# Way to define a constant. my_cnst: const = 1
That clashes with type annotations. Constantness is not a type of the value, it's a statement about the name binding.

    x: int = some_expression

is a statement about the permitted values bound to `x`. The type checker can take it that `x` will always be bound to an int, and reason from that.

    x: const = some_expression

tells the type-checker nothing about what type `x` is. Unless it can infer the type of `some_expression`, it could only guess that `x` might be Any type. That's going to hurt type-checking. It would be better to introduce an independent syntax for constants. Let's say `const`:

    const x: int = some_expression

can tell the reader and the type-checker that `x` is an int, even if they can't infer the type from the expression, and tell the compiler that `x` is a constant.
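(For comparison only - not something either post mentions - the stdlib's `typing.Final` already combines a constness marker with a type, though it is enforced by static checkers rather than at run time:)

```
from typing import Final

MAX_RETRIES: Final[int] = 3   # checkers see both the type and the constness
MAX_RETRIES = 4               # runs fine; a checker such as mypy flags the rebinding
```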
# Leads to a warning: replacing (monkey-patching) a constant slot. my_cnst: const = 2
A warning? What's the use of that? Are your constants *actually constant* or are they just advisory? If they're just advisory, then we might as well stick with the convention to use UPPERCASE names and use a linter that warns if you re-bind a "constant". If you can rebind constants by just suppressing or ignoring the warning, then people will rebind constants. Relevant: Sometimes I run GUI applications from the command line. Invariably they generate a flood of runtime warnings. Clearly developers are paying no attention to warnings.
That seems like a pointless complication. If mod is already imported, why would I need to import it again from inside the function? Just to turn it into a local variable?

    mylocal = mod

Forbidding any imports inside a function would be annoying and frustrating, but at least there are no special cases and exceptions. Forbidding *some* imports, but allowing *unnecessary and redundant* imports inside a function makes no sense to me.
# RuntimeError my_cnst = 3
`my_cnst` is inside a function, so it should create a local variable, not attempt to rebind a global constant.
# RuntimeError mod.func2 = lambda x: 1
Yes you already make it clear that rebinding functions is not allowed.
I know "global variables considered harmful", but this looks to me like punishing users of globals for being bad by restricting what they can do to make their use of globals *worse* rather than better. - all globals must be pre-declared and initialised before use; - functions cannot clean-up after themselves by deleting their unneeded globals. These two restrictions will give the coder annoyance and frustration. What advantage does this provide to make up for that?
# Cheats don't work globals()["new"] = 1
That seems like it will probably break a lot of code, assuming you can even enforce it. Is your proposal for globals() to no longer return the global namespace dict?
So is `fun`. Are you aware that Python's execution model treats:

- function *objects* as first-class values, same as ints, strings, floats, lists, etc
- and *names* bound to function objects ("functions") as no different from names bound to any other object?

You seem to be introducing a complicated execution model, where names bound to functions are different from other names, and functions are not first-class values any longer, but either privileged or restricted, depending on whether you think "functions are values" is a misfeature to be removed or a feature to be cherished.
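(A minimal illustration of that execution model, in ordinary Python - the names used are my own:)

```
def greet():
    return "hi"

f = greet      # the name `greet` is just a binding to a function object
greet = 42     # ...and, today, can be rebound to any other object
print(f())     # hi
print(greet)   # 42
```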
# fun2 is an alias of fun fun2: const = fun
I don't really understand the purpose of your two modes here. In one mode, the interpreter automatically calls `__main__`; in the other mode, the interpreter runs the standard `if __name__ ...` idiom and then calls `__main__`. How does the interpreter know which mode it is in? Presumably there must be a "strict" modal switch. If that's the case, why not use the presence of that switch to distinguish strict mode from Python mode, instead of this convoluted plan of sometimes automatically calling `__main__` and sometimes not?

This mandatory-but-only-sometimes special function seems pointless to me. Just because we're running under strict mode, why should the `if __name__` idiom stop working? It's just code. Your strict mode has to go out of its way to prevent it from working. I'm sure you know that if this proposal goes ahead, people will immediately demand that `__main__` is automatically called in non-strict mode too. So I can't help but feel you are using this as a Trojan Horse to sneak in a stylistic change: replacing the `if __name__` idiom with automatically calling a special, magic function. That seems to have zero technical benefit. What if I want to call my application entry point `main()` or `Haupt()` or `entrypoint()`?

What if I put the `if __name__` idiom *above* the special magic function? Does it still get magically ignored?

    if __name__ == '__main__':
        print('Starting')

    def __main__():  # magic
        print('Running')

    __main__()

    if __name__ == '__main__':
        print('Goodbye')

I *think* that under your proposal, under regular Python mode, it will print Starting, Running, Goodbye, but under your strict mode, it will print Starting, Running *twice*, and then exit. Maybe. It's not very clear. It could print Running only, and not print Starting or Goodbye at all.

Your proposed strict mode doesn't seem to actually be limited to making Python "stricter" for the purposes of optimization or clarity, but also seems to include changes which seem to be more *stylistic* changes which (probably) aren't necessary from the implementation:

* globals must be defined in the top level of the module;
* global constants override local variables;
* functions aren't variables like everything else;
* enforced and automatic special entry-point function;
* discourage the `if __name__` idiom by making it half redundant;

etc.

-- Steve

On Wed, Dec 2, 2020 at 10:45 AM Steven D'Aprano <steve@pearwood.info> wrote:
Even though decorator syntax is described as being equivalent to define-decorate-replace, it actually assigns only once. So if the definition of "constant" is "may only be bound once", a decorated function would be fine. (But doing it explicitly wouldn't.) ChrisA

Hello, On Wed, 2 Dec 2020 10:42:25 +1100 Steven D'Aprano <steve@pearwood.info> wrote:
Nobody calls literal "1" a variable. That example explores the meaning of: def func1(): pass func1 = 1 And from a PoV of a human programmer, it's like written in the comment - first, the symbol (name) "func1" is defined as a function, and then it's redefined as a variable.
Is monkey-patching disallowed because `mod.func1` is defined as a constant?
What "disallowed" do you mean? The example above clearly says "Leads to a warning". At "run-time" (i.e. eventually, after you've done the monkey-patching), it's disallowed, yes.
Or are all functions automatically considered to be constants?
That's the case, right.
That's "conceptual model". How it's actually implemented is, "bu abuse of notation" is: def decorate(func): ... Works without hitch with the strict mode.
You'll get a warning, and if you're smarter than the strict mode in that case, you can disable it (in the same sense that you can disable any warning in Python).
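(An illustrative sketch of what "disable it like any warning" could look like, assuming - hypothetically - that strict-mode diagnostics are issued through Python's standard warnings machinery; the message pattern below is invented for the example:)

```
import warnings

# Hypothetical filter: silence a specific strict-mode diagnostic while
# leaving all other warnings enabled.
warnings.filterwarnings(
    "ignore",
    message=".*replacing \\(monkey-patching\\) a constant slot.*",
)
```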
At *run-time* (aka "eventually"), any "assign" operation on a const name is disallowed, and additionally adding any new names is disallowed; ditto for deletions. At *import-time*, everything is allowed.
Will lead to a warning. Got bitten by that myself, yeah. (I llllove "from foo import *"). Had to finally implement __all__ in Pycopy due to those strict mode problems, can you believe it? Anyway, re: the example above - you would either need to mend your ways (e.g. run "import *" first, then conditionally define missing funcs), or disable the warning. [Have to stop here, will look at the rest later, thanks for comments.] [] -- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 02, 2020 at 10:52:30AM +0300, Paul Sokolovsky wrote: [Paul]
[...]
Those two code snippets demonstrate different things. Your original example demonstrates assigning to a module attribute from outside of the module. Your new example demonstrates assigning to a global variable from inside the same module. These call different byte-codes in CPython (STORE_ATTR and STORE_NAME). There is, as far as I know, no way to hook into and prevent STORE_NAME as yet. But STORE_ATTR calls are easy to override.
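(One illustrative sketch of such an override - my own, not necessarily the one Steven had in mind - using the known trick of swapping a module's `__class__` for a `types.ModuleType` subclass:)

```
import sys
import types

class ProtectedModule(types.ModuleType):
    def __setattr__(self, name, value):
        # Reject rebinding of attributes that are currently functions.
        if callable(getattr(self, name, None)):
            raise AttributeError(f"cannot rebind constant slot {name!r}")
        super().__setattr__(name, value)

# Installing the subclass intercepts the STORE_ATTR path, i.e. outside code
# doing `mod.func1 = 1`; STORE_NAME inside the module itself is unaffected.
sys.modules[__name__].__class__ = ProtectedModule
```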
> And from a PoV of a human programmer, it's like written in the comment
> - first, the symbol (name) "func1" is defined as a function, and then
> it's redefined as a variable.
It sounds like you are describing a programmer who doesn't understand
Python's execution model.
Do you understand that "func1" is already a variable? In the sense that
"variables" are name bindings, which is the only sense that applies to
Python. Just because it is bound to a function object doesn't make it
something different -- "func1" is still a name bound to a value, which
is the Python concept of a variable.
Paul, I would have more confidence in your proposal if you described
your proposal in terms of Python's existing execution model. Because so
far, the language you use to describe your proposal sounds like the
language used by somebody who doesn't understand Python's execution
model.
That does not give me confidence in your understanding of the
consequences of these changes, or that you are coming to this proposal
as somebody who loves Python. If you want to program in another language
with radically different semantics, there are hundreds of languages that
aren't Python. You don't have to turn Python into one of them.
> > Is monkey-patching disallowed because `mod.func1` is defined as a
> > constant?
>
> What "disallowed" do you mean? The example above clearly says "Leads to
> a warning". At "run-time" (i.e. eventually, after you've done the
> monkey-patching), it's disallowed, yes.
What disallowed do I mean? Exactly the same disallowed you just agreed
with.
> > Or are all functions automatically considered to be
> > constants?
>
> That's the case, right.
Okay, so all functions are automatically constants.
[regarding decorators]
> That's "conceptual model". How it's actually implemented is, "bu abuse
> of notation" is:
>
> def decorate(func):
> ...
>
> Works without hitch with the strict mode.
Okay, Chris made the same point.
Is that a language guarantee or a quirk of the implementation?
For this to be guarenteed to work requires the implementation of
decorators to be guaranteed by the language. If it is not already a
guarantee, it will have to be made one.
(For the record, that is not a point against your proposal, just an
observation.)
[...]
> > Occasionally I find that decorator syntax is not sufficient, and I've
> > used the explicit "define-decorate-replace" form. That won't work
> > either.
>
> You'll get a warning, and if you're smarter than the strict mode in
> that case, you can disable it (in the same sense that you can disable
> any warning in Python).
So "strict mode" isn't actually strict, it's just a built-in linter.
--
Steve

On Wed, Dec 2, 2020 at 9:18 PM Steven D'Aprano <steve@pearwood.info> wrote:
Yes, it's part of the original PEP: https://www.python.org/dev/peps/pep-0318/#current-syntax "without the intermediate assignment to the variable func" It's not often significant, but when it is, it's definitely the better way to do it. For one thing, it simplifies the understanding of idioms like this: @spam.setter def spam(self, newvalue): ... And it's not something I do often, but sometimes I've built a decorator that actually looks at the previous binding of a name. Something like this (hypothetical): @validator def frobnicate(thing): if not frobbable: raise Exception When called, the final frobnicate function would check the validator, and if it doesn't error out, would call the original function, whatever that had been. To do that, the validator() decorator function has to be able to see what was previously bound to that name. ChrisA

Hello, On Wed, 2 Dec 2020 21:15:22 +1100 Steven D'Aprano <steve@pearwood.info> wrote:
Nice language-lawyering pick! Behold my counter: Generally, the background idea of this proposal is to treat Python as "a generic programming language", and not to go to deep into (CPython) jargon and implementation details. From that point of view, those two things demonstrate very similar things: in 1st case, "func1" is assigned in another module's namespace, in 2nd case, "func1" is assigned in the current module's namespace. Implementation details you raise are insightful, but aren't needed for audience to understand those snippets. Generally, Steven, I know that you're among the people who could find *real* issues with my proposal. So, I look forward to, if you would so like, us to finish with this language-lawyering preamble, and get to the substance of the proposal.
There is, as far as I know, no way to hook into and prevent STORE_NAME as yet.
And yet I'm doing just that. You just didn't reach yet in your reading the section of https://github.com/pycopy/PycoEPs/blob/master/StrictMode.md#implementation . You'd immediately remember that we've got STORE_NAME covered. STORE_GLOBAL is a culprit however. []
No, I'm describing a programmer who understands Python's execution model well enough, and "fed up" with some parts of it, and looking forward to some adjusted execution models to expand their horizons. The whole point of the proposal is to introduce a new execution model, I hope the beginning of the proposal, "This proposal seeks to introduce opt-in "strict execution mode"" rings enough of a bell. I'm sorry if it didn't.
Do you understand that "func1" is already a variable?
Not only I understand that, that's exactly what I seek to mend. Generally, Steven, let me draw a picture. For a moment, imagine that we both know Python execution model. Then: You: Look at it and... like what you see. I: Look at it and... spotting problematic things to adjust.
That could be a fair point. But let me ask a fair question: what percentage of the full proposal have you read? Because if it's sufficiently less than "entire", than your premonitions could be premature. If anything, I'm glad I've posted the "TL;DR" version, because the real proposal has got exactly 0 responses so far, while so many people were already kind to kick at the direction of the "TL;DR". (And given that I explicitly said I'm going to post this proposal "for beating", kicks are exactly the expected outcome.)
Love can be hard, too.
But that's a part of meta-motivation of the most things I do to Python, sorry for not keeping you in loop! So, as someone who was long time with Python and knows it pretty well, I also know many of its issues. And I was looking for alternatives. And I found out that almost any other reasonable choice would give me more static/strict language nature. E.g., among the common crowd of Ruby/JavaScript/Lua, Python is the strongest-typed alternative (while being dynamically typed). And heck, I have to admit, I like that, so I don't downgrade to them. And the rest of crowd, are rather too static to be practical for human-scale computing (not touching on problems of syntax ugliness, bloat, and concerns of being advertisement product of a media company). So, I decided to make a contrarian move: sit on the things I like in Python, and work on changing things I don't like.
You don't have to turn Python into one of them.
Believe it or not, but the proposal is so cute, that "addresses" even that point. It's at the very end, titled "Trivia". I'll let you scroll to it at your pace. But just in case, it's along the lines of: "Python was kidnapped by the secret Smalltalk cult! Freedom to Python!" ;-) [the rest later] -- Best regards, Paul mailto:pmiscml@gmail.com

Hello, On Wed, 2 Dec 2020 21:15:22 +1100 Steven D'Aprano <steve@pearwood.info> wrote: [First part was answered previously.]
What's allowed and what's disallowed depends on whether a program just "loads" or "actually starts to run". While it loads, it can do all the tricks Python can do now. When it starts to run, then the actual "strict" restrictions are applied. That's the crucial part of the proposal, required to understand other aspects of the proposal.
Yes, that's the whole idea. []
It's strict. And it has many (more than one for sure) uses. One of the uses is to improve code structure (or understanding of it) of the existing codebases.
-- Steve
-- Best regards, Paul mailto:pmiscml@gmail.com

Hello, (As the thread continues, and goes from "why did you do that" to actual technical details/implication, I have little choice (if to maintain coherent discussion) but go thru existing posts looking into technical aspects of the proposal.) On Wed, 2 Dec 2020 10:42:25 +1100 Steven D'Aprano <steve@pearwood.info> wrote: [First part was answered previously.]
"const" is a *type annotation*.
Constantness is not a type of the value, it's a statement about the name binding.
Your interpretation, my interpretation: 'const is a type annotation like any other'. This poses a question of type annotation composability. Outside the scope of this discussion (a solution exists). All this was mentioned already in previous discussions. []
Yes, and that was mentioned already. But we'll do that only when we play enough with "const" as an annotation and see that it's so useful that is worth being promoted to a keyword. (I'm sure we'll end up there; but the current state is that not everyone is convinced of useful of "const" at all. So, no hurry, step by step.) []
A use of warning in programming language? A generic question outside the scope of the current discussion.
Are your constants *actually constant* or are they just advisory?
I already answered how my proposal works in this thread. And it's fully described in the proposal. So, please refer to it.
Maybe you can stick with UPPERCASE names. I'd prefer something more generic.
If you can rebind constants by just suppressing or ignoring the warning, then people will rebind constants.
If they no what they're doing, let them do that (while they can, until the "run-time" phase starts.)
Outside the scope of the discussion. E.g. C compilers have -Werror for that.
You missed the point, it's the opposite situation: "mod" is imported inside the function to start with. The rest of the discussion is how to accommodate that in the strict mode, given that "on the surface", that's prohibited in the strict mode. []
Good catch. "global my_cnst" was clearly missing before that. Fixed in https://github.com/pycopy/PycoEPs/blob/master/StrictModeTLDR.md
No, you missed the critical part of the proposal: that there're 2 phases of the execution of a Python program: "import-time" and "run-time". Things at the global scope of that sample show rules for import-time (lax, even const's can be redefined), while code inside function executes at "run-time" (strict in all its full glory). The information about 2 distinct execution phases is presented clearly even in the TLDR version. Thanks for missing it.
They won't be punished much more than the already existing "considered harmful" background establishes ;-).
- all globals must be pre-declared and initialised before use;
Already the best practice.
- functions cannot clean-up after themselves by deleting their unneeded globals.
Modules can. Functions? Never saw that, and that's clearly "not Pythonic". (But one of "dirty Python tricks".) []
The code which will be broken - it won't be able to run in the strict mode. It either will need to be fixed to "not do dirty Python tricks", or be left forever to work in "standard" mode.
Is your proposal for globals() to no longer return the global namespace dict?
At run-time, it would be wrapped in a read-only proxy. That's all explained in the proposal. Sorry, did you read it?
In with the strict mode.
Are you aware that Python's execution model treats:
I'm aware, yes. Are you aware that the strict mode proposal seeks to change that?
You seem to be introducing a complicated execution model,
Not *that* complicated is you think about it.
No, the model is different: function bindings are by default immutable bindings, which is the best practice for most languages. Only when proven otherwise, a function binding is promoted (or demoted, depending on your outlook) to a mutable binding.
Well, that's because one mode already exists, and another I'm adding in the proposal.
How does the interpreter know which mode it is in?
Described in the proposal.
Described in the proposal.
Described in the proposal.
What the people may demand some time later is outside the scope of the proposal.
I do not. For the people that you mention above my response is "no". (Aka: use the strict mode if you want to get tasty goodies, or stick with the old stuff.)
Strict mode doesn't change statement-by-statement sequential execution of the Python programs. Nor there's any magic in it. All is explained in the proposal.
No, it will print Running twice and nothing else.
That's true, and fully disclosed in the proposal (at the very beginning): "However, it is also believed that the "strict mode" may also help on its own with clarity and maintenance of large(r) Python codebases."
No, it must be presented for compatibility.
[] -- Best regards, Paul mailto:pmiscml@gmail.com

Hello, I see that code listing was partially garbled (code merged into some comments). It shouldn't be too bad to disambiguate it, but let me try to repost the code again: ``` import mod # Leads to a warning: replacing (monkey-patching) a constant slot (function) with a variable. mod.func1 = 1 # Leads to a warning: replacing (monkey-patching) a constant slot (function). mod.func2 = lambda: None # Way to define a constant. my_cnst: const = 1 # Leads to a warning: replacing (monkey-patching) a constant slot. my_cnst: const = 2 glb1 = 100 def fun(): # Imports are not allowed at run-time import mod2 # But you can re-import module previously imported at import-time. import mod # RuntimeError my_cnst = 3 # RuntimeError mod.func2 = lambda x: 1 global glb1, new # RuntimeError: Cannot create new global nameslots at runtime. new = 1 # Nor can delete existing del glb1 # Cheats don't work globals()["new"] = 1 # Leads to a warning: replacing (monkey-patching) a constant slot (function). def fun(): pass # fun_var is a variable storing a reference to a function (can store ref # to another func). fun_var = fun # fun2 is an alias of fun fun2: const = fun # Run-time execution starts with this function. This clearly delineates # import-time from run-time: a module top-level code is executed at # import-time (including import statements, which execute top-level code # of other modules recursively). When that is complete, strict mode # interpreter switches to run-time mode (restrictions enabled) and # executes __main__(). def __main__(): fun() # This statement is not executed when program runs in strict mode. # It is executed when it is run in normal mode, and allow to have # the same startup sequence (execution of __main__()) for both cases. if __name__ == "__main__": __main__() ``` On Tue, 1 Dec 2020 18:26:48 +0300 Paul Sokolovsky <pmiscml@gmail.com> wrote:
[] -- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 2:29 AM Paul Sokolovsky <pmiscml@gmail.com> wrote:
Wait, what? No. No no no. Please do not do ANYTHING like this. Having suffered under JavaScript's highly restrictive import system (and actually been glad for it, since the alternative is far worse), I do not want ANY version of Python to give up the full power of its import system, including the ability for a module to be imported only when it's actually needed. Imports inside functions allow a program to have optional dependencies, or dependencies that might be slow to load (eg numpy), and without that, even running your script with "--help" has to process every single import in the entire file. -1000. ChrisA

Hello, On Wed, 2 Dec 2020 02:39:38 +1100 Chris Angelico <rosuav@gmail.com> wrote:
Then it's luck that ALL versions and dialects of Python aren't under your control ;-).
But didn't you you already spotted a line which says that the strict mode also aspires to improve on the Python module practices? Under strict mode's firm but benevolent rule, there won't be slowly-loading modules any more. All imports will be fast. And modules which want to be slow will do that in their module.init() function.
-1000.
I also forgot to mention very important point in the intro: when you read this proposal, please don't think about "CPython". That for sure will send you the wrong vibes. Think about "Python". ;-)
ChrisA
-- Best regards, Paul mailto:pmiscml@gmail.com

On Tue, 1 Dec 2020 at 15:55, Paul Sokolovsky <pmiscml@gmail.com> wrote:
By which you mean "CPython and all other implementations" I assume? I'm also -1000 on this proposal, even if you just limit it to CPython. It's not at all clear what prompted this idea, but if you're suggesting "modify Python so we can make it faster" then I'd rather see a prototype implementation that demonstrated both the limitations *and* the improved performance. We shouldn't limit the core language based simply on speculative benefits that some implementation might be able to achieve. Paul

There exist TWO highly successful, widely used, JIT compilers for Python. PyPy and Numba. Neither one of them would have any use whatsoever for this constantness. Or if you believe otherwise, get a developer of one of those to comment so. JIT'd Python simply is not slow, even compared to compiled languages. Searching for maybe-possibly-someday optimizations while ignoring the actual speed paths, is silly. But I'll be moderate and only vote -100, 10x less negative than Paul Moore and Chris Angelico :-). On Tue, Dec 1, 2020 at 4:03 PM Paul Moore <p.f.moore@gmail.com> wrote:
-- The dead increasingly dominate and strangle both the living and the not-yet born. Vampiric capital and undead corporate persons abuse the lives and control the thoughts of homo faber. Ideas, once born, become abortifacients against new conceptions.

I think that what you want is another language, that already exists and it's RPython: https://rpython.readthedocs.io/en/latest/rpython.html See constants paragraph. RPython is used to create PyPy, not to limit normal Python programming :-)

Hello, On Tue, 1 Dec 2020 21:05:23 +0100 Marco Sulla <Marco.Sulla.Python@gmail.com> wrote:
I embarked to read RPython docs several times, and always found that I stop right at the top of that page, at the phrase "The exact definition is “RPython is everything that our translation toolchain can accept” :)". You know, I can very relate to that phrase. As a little guy, that's how I write my stuff - no proper docs, no nothing, and when somebody comes by asking what's supported or similar, the response is "RTFC!" (where "C" is "code"). But sorry, I just can't take that from a project which for a long time received non-trivial funding and which is supposed to serve as a base for other projects. You can compare that with my "strict mode" proposal where I try to spell out how the flaming thing works, even though the idea is banally simple. That said, that's my experiences with RPython. What are yours? What have you written with it? I'm especially interested in memory requirements. "Strict mode" is implemented in Pycopy, which can do more or less useful things in 16KB of heap, and the strict mode doesn't regress that much. How does RPython feel in 16KB? [] -- Best regards, Paul mailto:pmiscml@gmail.com

Hello, On Tue, 1 Dec 2020 19:09:28 +0000 David Mertz <mertz@gnosis.cx> wrote:
Of course not. What PyPy needed were European Commission grants, nothing else. Numba needed a lot of corporate backing for sure too, but they also needed LLVM bindings. So they picked some guy's module. Then they threw it away, because it was hard to maintain, and made their own. While LLVM has always shipped Python bindings. That's absolutely normal, but shows there's a lot of "random betting" happens even in "highly successful" projects. But most importantly, I'm not interested in survivorship bias. I'm not much interested to know how to write million-dollar JIT projects, because heck, that's known for decades. I'm interested to know how to NOT write "million-dollar" JIT projects, and why Unladen Swallow, Pyston failed. I'm also interested to know how to write JIT projects which do NOT cost millions of dollars. All that is quite an unusual hobby, you bet.
Yeah! It's just don't work for stuff you need in a way you need, and too bloated, so when you want to fix it, you can't.
Exactly interested in low-hanging optimizations. Much less interested in approaches like "we'll feed in some random crap, and LLVM will take care of it". So, hopefully, the motivation is clear - I'm doing this stuff, because it's so obvious thing to do, and the guys who got the same idea in 2001 or so, didn't seem to have left tangible artifacts beyond deadlocked PEPs.
But I'll be moderate and only vote -100, 10x less negative than Paul Moore and Chris Angelico :-).
[] -- Best regards, Paul mailto:pmiscml@gmail.com

Hello, On Tue, 1 Dec 2020 16:02:21 +0000 Paul Moore <p.f.moore@gmail.com> wrote:
No, I mean exactly what's written in the proposal: "This proposal seeks to introduce opt-in "strict execution mode" for Python language implementations interested in such a feature."
I'm also -1000 on this proposal, even if you just limit it to CPython.
It's not at all clear what prompted this idea,
The proposal is 32+K (whoa, am *I* scribbled all that?!), so I suspect somewhere in there it makes it clear(er).
Yeah, I'd like to see that too! I just calculated that if betting on myself, it may take years, or maybe I give up or switch over to something else, like most other people do. So, I decided to throw over the fence what I have now - the idea, more or less detailed "spec" for it, and even an implementation in a niche Python dialect, but exactly the one of the kind which may benefit from it. Just imagine that if someone wrote previously such a detailed spec, which I liked - I might implement it now. And if they actually even provided a sample implementation, I might now code changes for it in the compiler, and maybe even run a few tests to provide those performance figures which you and me so much would like! Getting thoughts like that, most people I know would reflect them on themselves, and I'm not an exception. So, I post whatever I have for peer review, and continue.
That's why it's introduced as an *opt-in* feature for *interested* implementations.
Paul
-- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 6:11 AM Paul Sokolovsky <pmiscml@gmail.com> wrote:
Maybe flip this around a bit -- it seems that RPython, from the PyPy project is an implementation of a "strict" version of Python. You mentioned that it was frustrating that it does not have a properly written (or any?) spec. but it DOES have an implementation, so maybe you can write the spec instead of the implementation :-) Honestly, I myself have MANY times decided to write my own thing, rather than take the time to figure out what someone else already wrote -- even though it usually takes longer and gets a worse result :-) I have not followed PyPy closely, but I suspect that the lack of a formal spec for Rpython is not laziness, but rather that the entire point of it is to support PyPy, so they want to be free to adjust it as needed, rather than sticking to a a spec (or constantly trying to maintain a spec) Nevertheless, there are probably some really good lessons in there, and it would be very interesting to see if (a version of) RPython could reasonably be used to directly write general purpose programs. Other than the fact that a lot of work has already been done on it, RPython has the advantage that (presumably) its restrictions are there because they have been shown to help performance. Another option would be to build on something like Cython -- taking advantage of the type specifications at run time, without pre-compiling the entire module. NOTE on that: back when there was a lot of discussion about standardizing type hints, I asked about making them work with, e.g. Cython, to get performance benefits. The answer at that time was that performance was NOT the point of type hinting -- i.e. it was designed explicitly to support Pyton's dynamic nature. So it seems adding things like you are proposing with an eye to performance is not really where the Python community wants to go. -CHB
-- Christopher Barker, PhD Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython

Hello, On Wed, 2 Dec 2020 09:53:07 -0800 Christopher Barker <pythonchb@gmail.com> wrote:
No, it's "restricted" version of Python, that's what "R" in RPython means. "Flip around" is very related however. In a sense, RPython and "strict mode" approaches are opposite: RPython starts with a simple subset of Python (and let you write a full Python in it), while "strict mode" takes a full Python and tries to take away from it as little as possible to achieve the desired effect (optimizing name lookups in this case). I hope you agree the difference in approaches is quite noticeable.
It definitely has some docs. Yes, I would say they're not ideal.
but it DOES have an implementation, so maybe you can write the spec instead of the implementation :-)
For that I would need to use RPython. I considered that circa 5 years ago, and of course explored it. I "liked" what I saw, sure. But I "wasn't happy" with what I saw. There seem to be hints that I may be making this "strict mode" thingy because I wasn't aware of PyPy/RPython. But actually it's the opposite: being aware of RPython and knowing enough about it is what causes me to explore a different direction.
I'm an open-source guy all along, and consider duplication of effort to be one of "mortal sins" of the open-source. Beyond that, I'm also a lazy dude who prefers to spend his days on the beach and not in the front of computer. Neither I ever was mad enough to "write my own thing". I wanted to write a small Python literally for decades, but instead studied what other people did/do. I for example rejected TinyPy as a base, due to its source code quality (but its spirit is very high, I consider my Pycopy to be spiritual successor of it). Then when MicroPython was announced on Kickstarter, I literally talked its author to open-source it a half-year earlier, and contributed to it for several years, literally having written a third of it. It's the same with the "strict mode". While there's a bunch of "original research" in it, it's all based on (or related to) ideas expressed by other people, which I made sure to have studied before proceeding to implementation (references are given in the proposal).
I fully agree, and fully understand that. Again, that's a reason to explore a different direction, not against it. ("That area is already covered by good people, let's look elsewhere.")
It's not, and that's disclosed right away. If anything, lack of "too beautiful docs" is related to PyPy's project desire to not make RPython a standalone, general-purpose dialect of Python. Nor it's comparable to a "normal" Python, usually presented as "C-like subset" of Python. (I'm writing this by memory and don't have references at hand, so specific terminology used by them may be different).
I remember that very well, and that's another point I don't agree with.
I never met a Python user who said something like "I want Python to be slow" or "I want Python to keep being slow", so we'll see how that goes.
-CHB
[] -- Best regards, Paul mailto:pmiscml@gmail.com

just one more note:
But many that might say "I don't want to make Python less flexible in order to gain performance" Of course no one one is going to reject an enhancement that improves performance if it has no costs. My thought on your idea is this: Yes, a more restricted (strict) version of Python that had substantially better performance could be very nice. But the trick here is that you are proposing a spec, hoping that it could be used to enhance performance. I suspect you aren't going to get very far (with community support) without an implementation that shows what the performance benefits really are. I'm just one random guy on this list, but my response is: "interesting, but show me how it works before you make anything official" -CHB ----- Christopher Barker, PhD Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython

On Sat, Dec 5, 2020, 3:03 PM Christopher Barker <pythonchb@gmail.com> wrote:
From another who like CHB is just random person on this list (but probably even more "random"), interested enough to have read the entire thread and the other thread, but not knowledgeable or competent enough to offer detailed comments that are going to be particularly helpful to anyone, I'd say this: If you could actually make a fully functioning python that is significantly faster by doing this, and it introduced this two-stage interpreter idea with a much more strict secondary stage, and did not require all kinds of additional syntax in the code to get the speed improvements (like cython for example), I'd think you really might have something that would help a lot of people see the benefits of a potential switch to a stricter paradigm of writing in an ostensibly dynamic language that nonetheless would now have to be written much more less dynamically when inside functions and class methods. It seems to me that if the speed increase is enough, it could be worth the decrease in flexibility, potentially. At least enough to support the existence of a second mode of python execution (whether that mode lives in cpython or not doesn't seem to much matter to me). However I think maybe a big big problem is probably going to be the lack of interest in very popular third party, and even standard, libraries to rewrite their code to fit a D1S2 (dynamic stage one, strict stage two) interpretation model. It seems likely many heavily used packages will simply be near totally broken for your strict interpreter, and many many others will need tweaking. So they will have to be rewitten, or at least tweaked, somehow. Maybe many could be rewritten automatically? I do not know. But I think you need to consider that you could get to the end of writing this thing and have it working perfectly with a major (10x? 50x?) speed improvement, and still have trouble getting people interested because you can't run code in, like, the enum or pathlib or functools library, or requests or numpy or something else. That would be a bummer. How do you see that problem getting solved?

Hello, On Sun, 6 Dec 2020 00:37:15 -0500 Ricky Teachey <ricky@teachey.org> wrote: []
That's exactly the interesting part, which would be interesting to discuss, with interested parties. Just to give an idea of my timeline: I coded the basic strict mode implementation for Pycopy in August last year. Then made another pass over in November last year. Before merging it to Pycopy mainline, I wanted to make sure it's viable with "general Python code". That's why for winter holidays 2019/2020 I coded up CPython pure-Python impl, https://github.com/pfalcon/python-strict-mode. Of course, I faced issues with CPython, went on to argue with CPython developers that they should fix their stuff, and then suddenly winter holidays were over. Fast forward to this November, I figure I'm not making progress. So I think "god cares about CPython software, I care about *my* software". I went to convert whatever codes I already had running in Pycopy to the strict mode and found it's not bad at all (fixed gazillion usability bugs with the strict mode, yeah). The spec, I started to write it, because after such a delay, my first reaction was literally the sane as @rosuav in his reply here on the list: "Wait, wtf we don't support dynamic module imports, I lllluuuuve dynamic module imports." So, I had to remind me why, and write it down this time. That's when open-source project get documentation - when the authors themselves find a need for it ;-). Bottom line, here's the biggest change I had to apply to my most mind-boggling dynamic-imports app: https://github.com/pfalcon/ScratchABlock/commit/ac2a9145ec8c05fe2be7c982d88a... The app allows to pass on the command line a dir name, which can be full of files, and then inside each file, there can be multiple module names to import. Whoa! Still, 25 lines to cover it. To see whether it's much or not, would need to compare what it would take me to do that in a static language. So, above I'm using Python as a kind of DSL for my app. In a static language, I would need to write a *real* DSL: all the lexer/parser/interpreter business. Not 25 lines at all. And in Python, I can pay 25 lines price to get rid of the most obnoxious Python misfeature comparing to a static language: blatantly inefficient namespace lookups. Again, I'd be only more interested to hear/see/tell more stories about that. Just need to start somewhere.
I don't see much of a problem at all. I see it the same way as e.g. Cython or Mypyc authors do: "to use this stuff, you need to change your Python code". So, what we need to compare is how much you need to change and what you get in return. The strict mode asks for rather modest changes comparing to the tools above. But neither it claims 10x-50x speed improvement. Actually, the idea behind the strict mode is not to make Python faster. It's to make Python *not slower*. In one particular area - name lookup (and then only static classes/functions mostly, but stay tuned for the strict mode part 2, where we brainstorm object method lookup). -- Best regards, Paul mailto:pmiscml@gmail.com

Hello, On Sat, 5 Dec 2020 12:02:52 -0800 Christopher Barker <pythonchb@gmail.com> wrote:
And I'd shake hands with them, because I add "strict mode" as an additional optional mode beyond the standard Python's mode. (I'd however expect that I personally use it often, because it's just a small notch above how I write programs in Python anyway.)
As I mentioned in previous replies, I fully agree that it would be nice to see performance figures. But sadly, as directly related to the strict mode, those aren't available yet. However, if the question is to explicate the idea further, that can be done on synthetic examples right away. Suppose we have a pretty typically-looking Python code like (condensed to save on vertical space): --- def foo(): a = 1; b = 2 for _ in range(10000000): c = min(a, b) foo() --- The problem with executing that code is that "min" per the standard Python semantics is looked up by name (and beyond that, the look up is two-level, aka "pretty complex"). Done 10 mln types in a loop, that's gotta be slow. Let's run it in a Python implementation which doesn't have existing means to optimize that "pretty complex" lookups, e.g. my Pycopy (btw, am I the only one who finds it weird that you can't pass a script to timeit?): $ pycopy -m timeit -n1 -r1 "import case1" 1 loops, best of 1: 2.41 sec per loop A common way to optimize global lookups (which are usually by name in overdynamic languages) is to cache the looked up value in a local variable (which aren't part of external interface, and thus are usually already optimized to be accessed by "stack slot"): --- def foo(): from builtins import min a = 1; b = 2 for _ in range(10000000): c = min(a, b) foo() --- $ pycopy -m timeit -n1 -r1 "import case3" 1 loops, best of 1: 551 msec per loop 4 times faster. So, the idea behind the strict mode is to be able to perform such an optimization automatically, without manual patchings like "from builtins import min" above. And the example above shows just the surface of it, for bytecode interpretation cases. But the strict mode reaches straight to the JITted machine code, where it allows to generate the same code for function calls as it would for C. The "code for function calls" is the keyword here. Of course, Python differs from C in more things that just name lookups. And most of these things are necessarily slower (and much harder to optimize). But the name lookups don't have to be, and the strict mode (so far) tries to improve just this one aspect. And it does that because it's simple to do, for very modest losses in Python expressivity (adjusted for real-world code sanity and maintainability). And it again does that to put a checkmark against it in move to the other things to optimize (or not).
It's nothing "official", it's completely grass-roots proposal for whoever may be interested in it. But I have to admit that I like it very much (after converting a few of my apps to it), and already treat it as unalienable part of the semantics of my Python dialect, Pycopy.
-CHB
[] -- Best regards, Paul mailto:pmiscml@gmail.com

On 2020-12-05 05:49, Paul Sokolovsky wrote:
The main thing I don't understand is what you expect to be the result of this proposal; or, in other words, what you expect anyone besides you to do with your idea. You say we shouldn't think in terms of CPython, that this is separate from CPython. You say you're also not interested in existing alternative Pythons or quasi-Pythons like Cython. So what is it you want? Some people here to say "Yes, I'll join you and work on a totally new restricted version of Python from scratch just to see what happens"? -- Brendan Barnwell "Do not follow where the path may lead. Go, instead, where there is no path, and leave a trail." --author unknown

Hello, On Sat, 05 Dec 2020 21:48:53 -0800 Brendan Barnwell <brenbarn@brenbarn.net> wrote:
Roughly, I expect it to be about the same as for e.g. https://www.python.org/dev/peps/pep-0509/ . The background motivation of the two is the same: look for ways to optimize name lookups. My however reaches "above surface" and requires some restructuring on the side of the Python programs, so potentially "affects" (as in: can elicit response) from wider audience than PEP509, which is clearly targeted at Python internals enthusiasts. Generally, as any human being, I'm interested to communicate, and potentially, even cooperate, with other alike-minded human beings. I don't pledge for the latter. I will understand even if nobody is truly interested in my idea. But it's my strong belief that there should be more people interested in this stuff (not just my particular proposal, but stuff related to conceptual, "meta", and on the hand of spectrum, implementation, matters in Python). So, I post to whoever may be lurking around, or who wanted to look at that stuff for a long, and bringing it up may be the "last straw" for them to actually dig into it. Receiving criticism from not-really-interested people is also helpful, but such a discussion quickly derails, as the experience shows. So, coming to specifics, some points which *could* be discussed: 1. The proposal makes claims that some of the restrictions imposed are already oftentimes imposed by codebases caring about their hygiene. It would be helpful to get (detailed enough, not low-effort) aye's or nay's. 2. For restrictions where the proposal goes beyond something which can be called "existing practices", how harsh are those, and what can be done about them? There's literally one (YMMV) "grave" thing in the proposal - prohibition of runtime imports. It's also outlined how to address them. So, if we get past Chris Angelico's "no no no no" for a curious walk, what do we see? As an example of possible p.1 discussion point, a case of global variable definition/declaration matter was brought up in another message. So, do you define your global variables at the module's top level? How do you feel about it? Is it: a) Are you nuts? How else can it be done? b) I do it. c) I don't do it. d) Defining the global variables at the global scope? Only over my dead body! -- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 2:53 AM Paul Sokolovsky <pmiscml@gmail.com> wrote:
If there are special-purpose cut-down implementations that provide a more restricted environment in return for some other feature (say, "being able to run in a web browser", or "being able to run on a microcontroller"), then that's fine, but that's not a language feature - that's a partial implementation that sacrifices the parts it can't afford to do. From the sound of your proposal, this should be part of the main Python language and be a normal expectation of code.
Still -1000 on that, partly because you can never guarantee that it'll be fast (if only because searching for and loading large numbers of files can be very expensive without any single one of them being blamed for the cost), and partly because two-phase initialization is something Python tries hard to avoid. You don't get code like this: f = file("/path/to/file") f.open() db = psycopg2.database("postgresql://user@localhost/dbname") db.connect() Two-phase initialization means that everything in that module (or class, as the case may be) has to first check if it's been initialized. Why not just initialize it when you create the object? Why not just initialize when you import the module? How much do you really gain by forcing this? You get stub modules everywhere that aren't initialized yet, but have to be imported just to satisfy this restriction. Current code simply... doesn't import it.
Yes, I'm thinking about Python, but unless you specifically tell me that this is a narrow special-purpose sublanguage, I'm going to assume you want CPython to at least respect this, even if it doesn't take advantage of it. ChrisA

Hello, On Wed, 2 Dec 2020 03:02:26 +1100 Chris Angelico <rosuav@gmail.com> wrote: []
Kinda yes, except not as vulgar as examples above. More like "being able to develop a JIT which doesn't cost million dollars" (extra points if that still can run on a microcontroller!).
The sacrifice is coming more from theoretical constraints on what's (easily) possible and what's not, not from vulgar constraints like "my microcontroller doesn't have space for that". How to be a dynamic language (addressing only dynamic name lookup here) and not suffer from that? How to have a cake and eat it? A simple answer (and we're interested in such here) is that you can't. First you have the cake - and heavy emphasis in the proposal is put on that, so you didn't start with just breadcrumbs. Then you eat the cake. To give the right idea of how to look at this stuff, you should compare this "strict mode" with Python compilers like Mypyc or Shedskin, and what restrictions they place.
From the sound of your proposal, this should be part of the main Python language and be a normal expectation of code.
I don't know where you got that sound from, because the very first sentence of the very first "Introduction" section reads: "This proposal seeks to introduce opt-in "strict execution mode" for Python language implementations interested in such a feature." So, is for example the CPython implementation interested? []
This proposal puts a heavy emphasis on a JIT usecase, and JIT is known to have its startup costs. JIT is also known to not be a panacea fro all usecases, why this proposal says that strict mode doesn't replace "normal" mode, some programs are just bound to be run in it.
and partly because two-phase initialization is something Python tries hard to avoid.
This proposal specifically introduces two-phase startup sequence, which does affect imports. Price of magic. Now, all that in detail discussed in the proposal: https://github.com/pycopy/PycoEPs/blob/master/StrictMode.md#dynamic-module-i... (yes, I put it in a repo to link to specific sections). Sneak peeks for you: "By far, the most "grave" issue with the strict mode is that dynamic (at run-time) imports are not supported" "Let us walk thru why" "To perform any non-trivial optimization on dynamic languages, whole-program analysis is required." "Don't get me wrong - I absolutely love dynamic imports! As an example, the strict mode was implemented in the Pycopy dialect of Python, and of 5 not completely trivial applications written for Pycopy, 5 use dynamic imports." "There is actually yet another option to tackle dynamic imports problem - to ease restrictions" So, I feel, and share, your pain ;-). []
I'd like to have a pure-python implementation running on CPython, yes. I'm not sure what you mean by "respect this". Any valid strict-mode program is also a valid normal-mode program. You can treat strict mode as a kind of type linter - it will pinpoint issues violating particular discipline (not totally insane), you fix them, and then can run without it too.
ChrisA
-- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 9:10 AM Paul Sokolovsky <pmiscml@gmail.com> wrote:
That's completely the opposite of what I was talking about, then. If the goal is to develop a JIT compiler, then it should surely be part of the main implementations of the language - the ones that are broadly compatible with each other. Cut-down Python implementations are NOT fully compliant with the language spec, and that's okay, because nobody expects them to be. If the restricted execution model is incompatible with most Python scripts, why would anyone bother to use it? Does it actually offer performance advantages better than PyPy, which JIT-compiles and is able to run normal Python scripts? ChrisA

Hello, On Wed, 2 Dec 2020 09:16:56 +1100 Chris Angelico <rosuav@gmail.com> wrote:
E.g. because someone who would want to experiment with JIT, would need to apply similar restrictions anyway.
As I mentioned in another response, it's all contrarian approach, right from the start. It's not whether PyPy offers performance advantages, it's whether it runs in the target environments I consider interesting at all. The answer is NO. I then consider what it does to NOT run in the environments I find interesting. It runs unmodified scripts, and throws unlimited amounts of memory at that. So, I see how to modify scripts in such a way to NOT throw unneeded memory at trivial things, while only improving the code hygiene. I get that it's hard to get ;-). Besides that, it also implements runtime support for "const" variables, which is closer to matters of CPython level. (E.g., if there's support for constants, CPython's pattern matching doesn't need to go for ugly workarounds of forcing to use "case Somewhere.SOMETHING", it can be just "case SOMETHING:").
ChrisA
-- Best regards, Paul mailto:pmiscml@gmail.com

Hello, On Wed, 2 Dec 2020 00:10:41 +0100 Marco Sulla <Marco.Sulla.Python@gmail.com> wrote:
That's my aspiration for the strict mode, yes. (I provide "full disclosure" on that fact.) Beyond that, strict mode, is well, strict. So, it may be interesting to people who want more "strictness" in Python. For example, some time ago, "type annotations" were introduced, and quite many people (though not everyone of course) aspire to make their programs more strict using them. The "strict mode" proposed here is similar, but explores different dimension of strictness.
Why can't this be done in a separate project, like PyPy or Pycopy?
Both (and many more around!) of those projects are Pythons. So, not only it can be, it should be done in as many projects as possible. (In which specific, is up to their maintainers.) [] -- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 6:20 PM Paul Sokolovsky <pmiscml@gmail.com> wrote:
Type annotations don't add any strictness. They just give information so that an external tool can choose to provide information to the coder. Python can't change its execution plans based on type annotations, because they have no actual runtime meaning; and the corollary of that is that type annotations are fairly safe. Annotating a function won't suddenly break it (assuming the annotation is syntactically valid). If constness can be used to actually change behaviour, then it really does have to restrict the language, and that's quite a different beast. ChrisA

Hello, On Wed, 2 Dec 2020 18:33:49 +1100 Chris Angelico <rosuav@gmail.com> wrote: []
Teh "strict mode" proposed here can be seen as such an external tool too. (It can be internal tool too.)
Python can't change its execution plans based on type
CPython can't, other Pythons can. Mypyc is a well-known Python which changes its execution plans based on type annotations. So, the culprit is the same: people continue to think Python == (current) CPython. We now seem to get to the point that most advanced people say "no, no, we understand the difference", but they continue to *think* that. They try to "size up" a change to their *CPython* experiences (slow-loading numpy, blah-blah), even though have been warned that way won't lead them anywhere. (They'll make a big circle and arrive to the conclusion "I can't use that with CPython right away", but that was the (implied) content of the initial warning). So again, you should not compare the "strict mode" with CPython. You should compare it with Mypy, Mypyc, Shedskin, Cython, pyflakes (nags you about things, and strict mode nags you too), etc. [] -- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 2, 2020 at 7:11 PM Paul Sokolovsky <pmiscml@gmail.com> wrote:
"Mypyc is (mostly) not yet useful for general Python development." So, it's not a fully compliant Python implementation. Is there any fully compliant Python that changes its behaviour based on type annotations? If not, "Python == CPython" isn't the problem here. ChrisA

On Wed, Dec 2, 2020 at 7:24 PM Chris Angelico <rosuav@gmail.com> wrote:
Forgot the citation: https://github.com/python/mypy/blob/master/mypyc/README.md ChrisA

Hello, On Wed, 2 Dec 2020 19:24:08 +1100 Chris Angelico <rosuav@gmail.com> wrote:
No worries, I don't claim that the strict mode is suitable for production use already. It's literally the first time a "full spec" (and still apparently subject to change") is posted and reference implementation is provided, beyond mentioning the idea here and there.
I'm not sure if you already smell it in the air, but the idea of a "fully compliant Python" is getting old (simply because it was there all this time, and people learned its drawbacks too). Various people are looking how to restrict it (e.g. apply *and* enforce type annotations), or put simple, look for "the good parts". I can't believe I use that term towards Python, as it has strong connotation of looking for gems in a total crap (e.g. https://www.goodreads.com/book/show/2998152-javascript for readers who are not in loop), but the recent decade showed that if applied consistently it can achieve even literally that effect. And just imagine what it can do if applied to a better-from-start language like Python! So, literally, more and more people are concentrating on a task of how to do better things with *good* Python programs, because how do to about *any* Python program is know for decades - just run it in a slow ugly interpreter, full of legacy [censored], commonly known as "CPython". Definition of "good" is still being sought for, and likely will vary from faction to faction. For example, there're people who seriously think that a "good Python program" is the one littered up to a sizable part of its content with ugly-looking type annotations of the current generation (already legacy, as pre-PEP563). My proposal gives an alternative example of what a "good Python program" may be. YMMV
ChrisA
[] -- Best regards, Paul mailto:pmiscml@gmail.com

On Thu, Dec 3, 2020 at 8:31 AM Paul Sokolovsky <pmiscml@gmail.com> wrote:
[... etc ...] I'm not sure if anyone else feels like doing the work of assembling the various insults and dismissive, arrogant tone stuff to forward to the moderator. Is it still Brett and Titus? I'm certain it's more than enough to earn Paul a healthy time-out from the list. For my own purposes, I'm just going to killfile him on the list, and ignore threads of other people responding. Unfortunately, this tone is spread over several different subject lines, so just one won't mute it. Yours, David... -- The dead increasingly dominate and strangle both the living and the not-yet born. Vampiric capital and undead corporate persons abuse the lives and control the thoughts of homo faber. Ideas, once born, become abortifacients against new conceptions.

Hello, On Thu, 3 Dec 2020 16:25:09 +0000 David Mertz <mertz@gnosis.cx> wrote:
If there's a last word to have, I'd prefer to be banned by Brett. That's his write-up I take as inspiration for these ideas: https://snarky.ca/what-is-the-core-of-the-python-programming-language/ [] -- Best regards, Paul mailto:pmiscml@gmail.com

On Tue, Dec 01, 2020 at 06:53:35PM +0300, Paul Sokolovsky wrote:
I never think *only* of CPython when reading PEPs. Unless the PEP is clearly and obviously about an implementation detail of CPython, I always read them as proposing a *Python* language change. Doesn't everyone? But I don't understand your comment. Are you suggesting that CPython will be exempt from this proposal? Or that the proposal is aimed at helping alternative implementations? Should we be thinking specifically of some alternate implementation? If not, then the purpose of this "very important point" isn't clear to me. -- Steve

CPython Extension Proposals should be called CEPs [image: --] Felipe V. Rodrigues [image: https://]felipevr.com <https://felipevr.com> On Tue, Dec 1, 2020 at 5:47 PM Steven D'Aprano <steve@pearwood.info> wrote:

Hello, On Wed, 2 Dec 2020 07:47:53 +1100 Steven D'Aprano <steve@pearwood.info> wrote:
I just think that some of people who think in terms of "Python" == "CPython", would have hard time imagining how this "strict mode" came by, if they will try to apply it the CPython situation. I also don't think that it applies to CPython's approach well. If anything, it's kind of "contrarian" approach to how CPython tries to do things (e.g. https://www.python.org/dev/peps/pep-0509/).
Should we be thinking specifically of some alternate implementation?
I guess that the thinking in terms of "tabula rasa" would help. Like, if you forget all the existing CPython baggage, and just face a problem of optimizing name lookups, what would you do? This proposal is one such exploration. Also Steven, it's literally the response I promised in our initial thread on the "const" annotation, https://mail.python.org/archives/list/python-ideas@python.org/message/SQTOWJ... As you can see, it's much more far-fetched than just how to implement the runtime honoring of the "const" annotation. Because it tries to "extract" as much as possible already existing constness in Python programs, and then drafts how to use that to optimize lookups (en masse, as there're a lot of constness indeed). If we only need to handle explicit "const" annotations, it's a pretty small subset of this proposal.
If not, then the purpose of this "very important point" isn't clear to me.
[] -- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 02, 2020 at 12:24:00AM +0300, Paul Sokolovsky wrote:
Are you aware that CPython is the reference implementation of Python the language? If you are introducing language features and semantic changes that CPython doesn't support, or might not support, than it isn't Python, it's a superset (or subset) of the language. Like Cython. Either that, or you are hoping for a revolution where your proposed alternative language becomes so successful that a majority of users abandon CPython, PyPy, Nuitka, Stackless, Jython, IronPython etc and flock to your restricted language, taking the name "Python" with it. I don't think that's very likely. So I don't think you should be thinking about scenarios where other implementations take this up and CPython doesn't. Rather, I think there are three scenarios: 1. Somebody invents a new language with your proposed semantics, similar to other Python-inspired languages as Cython, Boo, Nim and Cobra. (At least one of those, Cython, is a superset of Python; the others are just copying syntax rather than semantics.) 2. CPython takes this up, as an optional extension that other implementations are free to ignore. 3. Or CPython takes this up, and makes it mandatory for any interpreter claiming to be Python to support it. I don't think there is any chance of your proposal being blessed as official Python semantics without CPython supporting it.
This would be scenario number 1 above. A brand new language, inspired by Python, like Cobra. Or since it is backwards-compatible in non-strict mode, perhaps Cython is a better analogy. Or if you prefer, the Perl-inspired Raku. -- Steve

Paul wrote:
(Aside: the literal `1` is not a variable, and variables aren't first-class citizens in Python. We can't bind "a variable" to a name, only the variable's value.) Is monkey-patching disallowed because `mod.func1` is defined as a constant? Or are all functions automatically considered to be constants? If the later, then decorators won't work in strict mode: @decorate def func(): ... is defined as: # define-decorate-replace def func(): ... func = decorate(func) so if functions are automatically const, decorators won't work. Occasionally I find that decorator syntax is not sufficient, and I've used the explicit "define-decorate-replace" form. That won't work either. Is it only *monkey-patching* from outside of the module that is disallowed, or any rebindings to functions, including within the owning module itself? If the later, that's also going to break the common idiom: def func(): # Slow Python version. # maybe replace with fast C version from c_accelerators import * under strict mode.
# Way to define a constant. my_cnst: const = 1
That clashes with type annotations. Constantness is not a type of the value, it's a statement about the name binding. x: int = some_expression is a statement about the permitted values bound to `x`. The type checker can take it that `x` will always be bound to an int, and reason from that. x: const = some_expression tells the type-checker nothing about what type `x` is. Unless it can infer the type of `some_expression`, it could only guess that `x` might be Any type. That's going to hurt type-checking. It would be better to introduce an independent syntax for constants. Let's say `const`: const x: int = some_expression can tell the reader and the type-checker that `x` is an int, even if they can't infer the type from the expression, and tell the compiler that `x` is a constant.
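For comparison, the stdlib's `typing.Final` (available since Python 3.8) already combines constness with a value type in one annotation; a small illustration, with names chosen for the example:

```
from typing import Final

MAX_RETRIES: Final[int] = 5   # constant for the checker, and typed as int
TIMEOUT: Final = 30.0         # constness only; the type (float) is inferred

MAX_RETRIES = 6               # a type checker such as mypy flags this rebinding;
                              # nothing is enforced at runtime
```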
# Leads to a warning: replacing (monkey-patching) a constant slot. my_cnst: const = 2
A warning? What's the use of that? Are your constants *actually constant* or are they just advisory? If they're just advisory, then we might as well stick with the convention to use UPPERCASE names and use a linter that warns if you re-bind a "constant". If you can rebind constants by just suppressing or ignoring the warning, then people will rebind constants. Relevant: Sometimes I run GUI applications from the command line. Invariably they generate a flood of runtime warnings. Clearly developers are paying no attention to warnings.
That seems like a pointless complication. If mod is already imported, why would I need to import it again from inside the function? Just to turn it into a local variable? mylocal = mod Forbidding any imports inside a function would be annoying and frustrating, but at least there would be no special cases and exceptions. Forbidding *some* imports, but allowing *unnecessary and redundant* imports inside a function, makes no sense to me.
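For context, re-importing an already-loaded module is indeed roughly equivalent to the manual `mylocal = mod` binding: the `import` statement finds the module in `sys.modules` and only binds a local name. A small illustration using a stdlib module:

```
import json          # first import: the module is loaded and cached in sys.modules

def encode(obj):
    import json      # already cached: this just binds the *local* name "json"
    return json.dumps(obj)

print(encode({"a": 1}))   # prints: {"a": 1}
```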
# RuntimeError my_cnst = 3
`my_cnst` is inside a function, so it should create a local variable, not attempt to rebind a global constant.
# RuntimeError mod.func2 = lambda x: 1
Yes you already make it clear that rebinding functions is not allowed.
I know "global variables considered harmful", but this looks to me like punishing users of globals for being bad by restricting what they can do to make their use of globals *worse* rather than better. - all globals must be pre-declared and initialised before use; - functions cannot clean-up after themselves by deleting their unneeded globals. These two restrictions will give the coder annoyance and frustration. What advantage does this provide to make up for that?
# Cheats don't work globals()["new"] = 1
That seems like it will probably break a lot of code, assuming you can even enforce it. Is your proposal for globals() to no longer return the global namespace dict?
So is `fun`. Are you aware that Python's execution model treats: - function *objects* as first-class values, same as ints, strings, floats, lists, etc - and *names* bound to function objects ("functions") as no different from names bound to any other object? You seem to be introducing a complicated execution model, where names bound to functions are different from other names, and functions are not first-class values any longer, but either privileged or restricted, depending on whether you think "functions are values" is a misfeature to be removed or a feature to be cherished.
# fun2 is an alias of fun fun2: const = fun
I don't really understand the purpose of your two modes here. In one mode, the interpreter automatically calls `__main__`; in the other mode, the interpreter runs the standard `if __name__ ...` idiom and then calls `__main__`. How does the interpreter know which mode it is in? Presumably there must be a "strict" modal switch. If that's the case, why not use the presence of that switch to distinguish strict mode from Python mode, instead of this convoluted plan of sometimes automatically calling `__main__` and sometimes not? This mandatory-but-only-sometimes special function seems pointless to me. Just because we're running under strict mode, why should the `if __name__` idiom stop working? It's just code. Your strict mode has to go out of its way to prevent it from working. I'm sure you know that if this proposal goes ahead, people will immediately demand that `__main__` is automatically called in non-strict mode too. So I can't help but feel you are using this as a Trojan Horse to sneak in a stylistic change: replace the `if __name__` idiom with automatically calling a special, magic function. That seems to have zero technical benefit. What if I want to call my application entry point `main()` or `Haupt()` or `entrypoint()`? What if I put the `if __name__` idiom *above* the special magic function? Does it still get magically ignored? if __name__ == '__main__': print('Starting') def __main__(): # magic print('Running') __main__() if __name__ == '__main__': print('Goodbye') I *think* that under your proposal, under regular Python mode, it will print Starting, Running, Goodbye, but under your strict mode, it will print Starting, Running *twice*, and then exit. Maybe. It's not very clear. It could print Running only, and not print Starting or Goodbye at all. Your proposed strict mode doesn't seem to actually be limited to making Python "stricter" for the purposes of optimization or clarity, but also seems to include changes which are more *stylistic* changes that (probably) aren't necessary for the implementation: * globals must be defined in the top level of the module; * global constants override local variables; * functions aren't variables like everything else; * enforced and automatic special entry-point function; * discourage the `if __name__` idiom by making it half redundant; etc. -- Steve

On Wed, Dec 2, 2020 at 10:45 AM Steven D'Aprano <steve@pearwood.info> wrote:
Even though decorator syntax is described as being equivalent to define-decorate-replace, it actually assigns only once. So if the definition of "constant" is "may only be bound once", a decorated function would be fine. (But doing it explicitly wouldn't.) ChrisA
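A quick way to observe this single assignment, assuming `f` and `g` are not already bound in the module (the names are chosen purely for the illustration):

```
def probe(func):
    # Report whether the function's own name was already bound when the decorator ran.
    print(func.__name__, "already bound:", func.__name__ in globals())
    return func

@probe
def f():        # prints "f already bound: False" -- no intermediate assignment to "f"
    pass

def g():
    pass
g = probe(g)    # prints "g already bound: True" -- explicit define-then-replace
```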

Hello, On Wed, 2 Dec 2020 10:42:25 +1100 Steven D'Aprano <steve@pearwood.info> wrote:
Nobody calls literal "1" a variable. That example explores the meaning of: def func1(): pass func1 = 1 And from a PoV of a human programmer, it's like written in the comment - first, the symbol (name) "func1" is defined as a function, and then it's redefined as a variable.
Is monkey-patching disallowed because `mod.func1` is defined as a constant?
What "disallowed" do you mean? The example above clearly says "Leads to a warning". At "run-time" (i.e. eventually, after you've done the monkey-patching), it's disallowed, yes.
Or are all functions automatically considered to be constants?
That's the case, right.
That's "conceptual model". How it's actually implemented is, "bu abuse of notation" is: def decorate(func): ... Works without hitch with the strict mode.
You'll get a warning, and if you're smarter than the strict mode in that case, you can disable it (in the same sense that you can disable any warning in Python).
In the *run-time* (aka "eventually") any "assign" operation on a const name is disallowed, and additionally adding any new names is disallowed, ditto for deletions. At the *import-time*, everything is allowed.
Will lead to a warning. Got bitten by that myself, yeah. (I llllove "from foo import *".) Had to finally implement __all__ in Pycopy due to those strict mode problems, can you believe? Anyway, re: the example above, you would either need to mend your ways (e.g. run "import *" first, then conditionally define missing funcs), or disable the warning. [Have to stop here, will look at the rest later, thanks for comments.] [] -- Best regards, Paul mailto:pmiscml@gmail.com

On Wed, Dec 02, 2020 at 10:52:30AM +0300, Paul Sokolovsky wrote: [Paul]
[...]
Those two code snippets demonstrate different things. Your original example demonstrates assigning to a module attribute from outside of the module. Your new example demonstrates assigning to a global variable from inside the same module. These call different byte-codes in CPython (STORE_ATTR and STORE_NAME). There is, as far as I know, no way to hook into and prevent STORE_NAME as yet. But STORE_ATTR calls are easy to override:
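A minimal sketch of one such override (the module name, warning text, and policy here are illustrative, not taken from the original mail): a module can swap in a `types.ModuleType` subclass whose `__setattr__` intercepts every `mod.attr = ...` store.

```
# strict_mod.py -- warn when a function-valued attribute of this module is replaced
import sys
import types
import warnings

class _GuardedModule(types.ModuleType):
    def __setattr__(self, name, value):
        if callable(getattr(self, name, None)):
            warnings.warn(f"replacing constant slot {name!r}", stacklevel=2)
        super().__setattr__(name, value)

def func1():
    pass

# Rebinding the module's class routes all attribute stores through __setattr__ above.
sys.modules[__name__].__class__ = _GuardedModule
```

With that in place, `import strict_mod; strict_mod.func1 = 1` from another module triggers the warning, which is the kind of behaviour the proposal describes for import-time monkey-patching.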
> And from a PoV of a human programmer, it's like written in the comment
> - first, the symbol (name) "func1" is defined as a function, and then
> it's redefined as a variable.
It sounds like you are describing a programmer who doesn't understand
Python's execution model.
Do you understand that "func1" is already a variable? In the sense that
"variables" are name bindings, which is the only sense that applies to
Python. Just because it is bound to a function object doesn't make it
something different -- "func1" is still a name bound to a value, which
is the Python concept of a variable.
Paul, I would have more confidence in your proposal if you described
your proposal in terms of Python's existing execution model. Because so
far, the language you use to describe your proposal sounds like the
language used by somebody who doesn't understand Python's execution
model.
That does not give me confidence in your understanding of the
consequences of these changes, or that you are coming to this proposal
as somebody who loves Python. If you want to program in another language
with radically different semantics, there are hundreds of languages that
aren't Python. You don't have to turn Python into one of them.
> > Is monkey-patching disallowed because `mod.func1` is defined as a
> > constant?
>
> What "disallowed" do you mean? The example above clearly says "Leads to
> a warning". At "run-time" (i.e. eventually, after you've done the
> monkey-patching), it's disallowed, yes.
What disallowed do I mean? Exactly the same disallowed you just agreed
with.
> > Or are all functions automatically considered to be
> > constants?
>
> That's the case, right.
Okay, so all functions are automatically constants.
[regarding decorators]
> That's "conceptual model". How it's actually implemented is, "bu abuse
> of notation" is:
>
> def decorate(func):
> ...
>
> Works without hitch with the strict mode.
Okay, Chris made the same point.
Is that a language guarantee or a quirk of the implementation?
For this to be guaranteed to work requires the implementation of
decorators to be guaranteed by the language. If it is not already a
guarantee, it will have to be made one.
(For the record, that is not a point against your proposal, just an
observation.)
[...]
> > Occasionally I find that decorator syntax is not sufficient, and I've
> > used the explicit "define-decorate-replace" form. That won't work
> > either.
>
> You'll get a warning, and if you're smarter than the strict mode in
> that case, you can disable it (in the same sense that you can disable
> any warning in Python).
So "strict mode" isn't actually strict, it's just a built-in linter.
--
Steve

On Wed, Dec 2, 2020 at 9:18 PM Steven D'Aprano <steve@pearwood.info> wrote:
Yes, it's part of the original PEP: https://www.python.org/dev/peps/pep-0318/#current-syntax "without the intermediate assignment to the variable func" It's not often significant, but when it is, it's definitely the better way to do it. For one thing, it simplifies the understanding of idioms like this: @spam.setter def spam(self, newvalue): ... And it's not something I do often, but sometimes I've built a decorator that actually looks at the previous binding of a name. Something like this (hypothetical): @validator def frobnicate(thing): if not frobbable: raise Exception When called, the final frobnicate function would check the validator, and if it doesn't error out, would call the original function, whatever that had been. To do that, the validator() decorator function has to be able to see what was previously bound to that name. ChrisA
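One way such a decorator could be written (a sketch only; it leans on CPython's `sys._getframe` to find the caller's globals, and the function names are illustrative):

```
import functools
import sys

def validator(check):
    # Find what the decorated name was bound to *before* this decoration runs.
    caller_globals = sys._getframe(1).f_globals
    original = caller_globals[check.__name__]

    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        check(*args, **kwargs)            # run the validation; may raise
        return original(*args, **kwargs)  # then call whatever was bound before
    return wrapper

def frobnicate(thing):
    return f"frobnicated {thing}"

@validator
def frobnicate(thing):                    # redefines the name; validator saw the old binding
    if not isinstance(thing, str):
        raise TypeError("not frobbable")

print(frobnicate("widget"))               # check passes, then the original function runs
```

Under the proposed strict mode, this second binding of `frobnicate` is exactly the kind of function-slot redefinition that would at least warn at import-time.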

Hello, On Wed, 2 Dec 2020 21:15:22 +1100 Steven D'Aprano <steve@pearwood.info> wrote:
Nice language-lawyering pick! Behold my counter: generally, the background idea of this proposal is to treat Python as "a generic programming language", and not to go too deep into (CPython) jargon and implementation details. From that point of view, those two snippets demonstrate very similar things: in the 1st case, "func1" is assigned in another module's namespace; in the 2nd case, "func1" is assigned in the current module's namespace. The implementation details you raise are insightful, but aren't needed for the audience to understand those snippets. Generally, Steven, I know that you're among the people who could find *real* issues with my proposal. So I look forward, if you would so like, to us finishing with this language-lawyering preamble and getting to the substance of the proposal.
There is, as far as I know, no way to hook into and prevent STORE_NAME as yet.
And yet I'm doing just that. You just haven't yet reached the section https://github.com/pycopy/PycoEPs/blob/master/StrictMode.md#implementation in your reading. There you'd immediately see that we've got STORE_NAME covered. STORE_GLOBAL is a culprit, however. []
No, I'm describing a programmer who understands Python's execution model well enough, is "fed up" with some parts of it, and is looking forward to some adjusted execution models to expand their horizons. The whole point of the proposal is to introduce a new execution model; I hope the beginning of the proposal, "This proposal seeks to introduce opt-in "strict execution mode"", rings enough of a bell. I'm sorry if it didn't.
Do you understand that "func1" is already a variable?
Not only do I understand that, it's exactly what I seek to mend. Generally, Steven, let me draw a picture. For a moment, imagine that we both know Python's execution model. Then: You: Look at it and... like what you see. I: Look at it and... spot problematic things to adjust.
That could be a fair point. But let me ask a fair question: what percentage of the full proposal have you read? Because if it's sufficiently less than "entire", then your premonitions could be premature. If anything, I'm glad I've posted the "TL;DR" version, because the real proposal has got exactly 0 responses so far, while so many people were already kind enough to kick in the direction of the "TL;DR". (And given that I explicitly said I'm going to post this proposal "for beating", kicks are exactly the expected outcome.)
Love can be hard, too.
But that's a part of the meta-motivation of most things I do to Python, sorry for not keeping you in the loop! So, as someone who has been with Python a long time and knows it pretty well, I also know many of its issues. And I was looking for alternatives. And I found out that almost any other reasonable choice would give me a more static/strict language nature. E.g., among the common crowd of Ruby/JavaScript/Lua, Python is the strongest-typed alternative (while being dynamically typed). And heck, I have to admit, I like that, so I don't downgrade to them. And the rest of the crowd are rather too static to be practical for human-scale computing (not touching on problems of syntax ugliness, bloat, and concerns of being an advertisement product of a media company). So, I decided to make a contrarian move: sit on the things I like in Python, and work on changing the things I don't like.
You don't have to turn Python into one of them.
Believe it or not, the proposal is so cute that it "addresses" even that point. It's at the very end, titled "Trivia". I'll let you scroll to it at your pace. But just in case, it's along the lines of: "Python was kidnapped by the secret Smalltalk cult! Freedom to Python!" ;-) [the rest later] -- Best regards, Paul mailto:pmiscml@gmail.com

Hello, On Wed, 2 Dec 2020 21:15:22 +1100 Steven D'Aprano <steve@pearwood.info> wrote: [First part was answered previously.]
What's allowed and what's disallowed depends on whether a program just "loads" or "actually starts to run". While it loads, it can do all the tricks Python can do now. When it starts to run, then the actual "strict" restrictions are applied. That's the crucial part of the proposal, required to understand other aspects of the proposal.
Yes, that's the whole idea. []
It's strict. And it has many (more than one for sure) uses. One of the uses is to improve code structure (or understanding of it) of the existing codebases.
-- Steve
-- Best regards, Paul mailto:pmiscml@gmail.com

Hello, (As the thread continues, and goes from "why did you do that" to actual technical details/implication, I have little choice (if to maintain coherent discussion) but go thru existing posts looking into technical aspects of the proposal.) On Wed, 2 Dec 2020 10:42:25 +1100 Steven D'Aprano <steve@pearwood.info> wrote: [First part was answered previously.]
"const" is a *type annotation*.
Constantness is not a type of the value, it's a statement about the name binding.
That's your interpretation; my interpretation is: 'const is a type annotation like any other'. This poses a question of type annotation composability, which is outside the scope of this discussion (a solution exists). All this was mentioned already in previous discussions. []
Yes, and that was mentioned already. But we'll do that only when we have played enough with "const" as an annotation and see that it's so useful that it's worth being promoted to a keyword. (I'm sure we'll end up there; but the current state is that not everyone is convinced of the usefulness of "const" at all. So, no hurry, step by step.) []
The use of warnings in a programming language? A generic question outside the scope of the current discussion.
Are your constants *actually constant* or are they just advisory?
I already answered how my proposal works in this thread. And it's fully described in the proposal. So, please refer to it.
Maybe you can stick with UPPERCASE names. I'd prefer something more generic.
If you can rebind constants by just suppressing or ignoring the warning, then people will rebind constants.
If they know what they're doing, let them do that (while they can, i.e. until the "run-time" phase starts).
Outside the scope of the discussion. E.g. C compilers have -Werror for that.
You missed the point, it's the opposite situation: "mod" is imported inside the function to start with. The rest of the discussion is how to accommodate that in the strict mode, given that "on the surface", that's prohibited in the strict mode. []
Good catch. "global my_cnst" was clearly missing before that. Fixed in https://github.com/pycopy/PycoEPs/blob/master/StrictModeTLDR.md
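Presumably the corrected snippet is along these lines (a sketch using the proposal's own `const` annotation, which is not standard Python today):

```
my_cnst: const = 1     # module level: a constant slot, per the proposal's syntax

def fun():
    global my_cnst
    my_cnst = 3        # under strict mode this is a run-time assignment to a
                       # constant slot, hence the RuntimeError in the example
```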
No, you missed the critical part of the proposal: that there're 2 phases of the execution of a Python program: "import-time" and "run-time". Things at the global scope of that sample show rules for import-time (lax, even const's can be redefined), while code inside function executes at "run-time" (strict in all its full glory). The information about 2 distinct execution phases is presented clearly even in the TLDR version. Thanks for missing it.
They won't be punished much more than the already existing "considered harmful" background establishes ;-).
- all globals must be pre-declared and initialised before use;
Already the best practice.
- functions cannot clean-up after themselves by deleting their unneeded globals.
Modules can. Functions? Never saw that, and it's clearly "not Pythonic". (But it is one of the "dirty Python tricks".) []
The code which would be broken simply won't be able to run in the strict mode. It will either need to be fixed to not do "dirty Python tricks", or be left forever to work in "standard" mode.
Is your proposal for globals() to no longer return the global namespace dict?
At run-time, it would be wrapped in a read-only proxy. That's all explained in the proposal. Sorry, did you read it?
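For a rough feel of what a read-only wrapper over a namespace dict behaves like, the stdlib's `types.MappingProxyType` is an existing analogy (this is only an illustration, not the mechanism the proposal specifies):

```
import types

ro_globals = types.MappingProxyType(globals())

print(ro_globals["__name__"])   # reads go through to the underlying dict

try:
    ro_globals["new"] = 1       # the proxy rejects item assignment
except TypeError as exc:
    print("rejected:", exc)
```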
In with the strict mode.
Are you aware that Python's execution model treats:
I'm aware, yes. Are you aware that the strict mode proposal seeks to change that?
You seem to be introducing a complicated execution model,
Not *that* complicated if you think about it.
No, the model is different: function bindings are by default immutable bindings, which is the best practice for most languages. Only when proven otherwise is a function binding promoted (or demoted, depending on your outlook) to a mutable binding.
Well, that's because one mode already exists, and another I'm adding in the proposal.
How does the interpreter know which mode it is in?
Described in the proposal.
Described in the proposal.
Described in the proposal.
What the people may demand some time later is outside the scope of the proposal.
I do not. For the people that you mention above my response is "no". (Aka: use the strict mode if you want to get tasty goodies, or stick with the old stuff.)
Strict mode doesn't change statement-by-statement sequential execution of Python programs. Nor is there any magic in it. All is explained in the proposal.
No, it will print Running twice and nothing else.
That's true, and fully disclosed in the proposal (at the very beginning): "However, it is also believed that the "strict mode" may also help on its own with clarity and maintenance of large(r) Python codebases."
No, it must be present, for compatibility.
[] -- Best regards, Paul mailto:pmiscml@gmail.com
participants (11)
- Brendan Barnwell
- Chris Angelico
- Christopher Barker
- David Mertz
- Felipe Rodrigues
- Marco Sulla
- Paul Moore
- Paul Sokolovsky
- Ricky Teachey
- Stephen J. Turnbull
- Steven D'Aprano