Paul wrote:
Below is an example of a strict mode program, with comments explaining various features:
```
import mod
```
```
# Leads to a warning: replacing (monkey-patching) a constant slot
# (function) with a variable.
mod.func1 = 1
```
(Aside: the literal `1` is not a variable, and variables aren't first-class citizens in Python. We can't bind "a variable" to a name, only the variable's value.)

Is monkey-patching disallowed because `mod.func1` is defined as a constant? Or are all functions automatically considered to be constants? If the latter, then decorators won't work in strict mode:

```
@decorate
def func():
    ...
```

is defined as:

```
# define-decorate-replace
def func():
    ...

func = decorate(func)
```

so if functions are automatically const, decorators won't work. Occasionally I find that decorator syntax is not sufficient, and I've used the explicit "define-decorate-replace" form. That won't work either.

Is it only *monkey-patching* from outside of the module that is disallowed, or any rebinding of functions, including within the owning module itself? If the latter, that's also going to break the common idiom:

```
def func():
    # Slow Python version.
    ...

# Maybe replace with the fast C version.
from c_accelerators import *
```

under strict mode.
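To make the equivalence concrete, here is a runnable sketch (the decorator and function names are invented for illustration) showing that decorator syntax is exactly define-then-rebind:

```python
import functools

def decorate(func):
    # A trivial example decorator: wraps func and adds 1 to its result.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs) + 1
    return wrapper

@decorate
def func():
    return 1

# The decorator syntax above is sugar for define-decorate-replace:
def func2():
    return 1

func2 = decorate(func2)  # rebinds a name bound to a function

assert func() == 2
assert func2() == 2
```

Both spellings end with the name rebound to a new function object, so a rule forbidding rebinding of function names would forbid both.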
```
# Way to define a constant.
my_cnst: const = 1
```
That clashes with type annotations. Constantness is not a type of the value, it's a statement about the name binding.

```
x: int = some_expression
```

is a statement about the permitted values bound to `x`. The type checker can take it that `x` will always be bound to an int, and reason from that.

```
x: const = some_expression
```

tells the type-checker nothing about what type `x` is. Unless it can infer the type of `some_expression`, it could only guess that `x` might be Any type. That's going to hurt type-checking.

It would be better to introduce an independent syntax for constants. Let's say `const`:

```
const x: int = some_expression
```

can tell the reader and the type-checker that `x` is an int, even if they can't infer the type from the expression, and tell the compiler that `x` is a constant.
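For comparison, the stdlib already has a static-only spelling of this idea: `typing.Final` (PEP 591) marks the *binding* as constant while still carrying a type for the checker, though it is enforced only by tools like mypy, not at run time:

```python
from typing import Final

# Final qualifies the binding as constant and still carries a type,
# much like the `const x: int = ...` spelling suggested above.
MAX_RETRIES: Final[int] = 3

# A static checker flags `MAX_RETRIES = 4` as an error; CPython itself
# will not stop the rebinding at run time.
assert MAX_RETRIES == 3
```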
```
# Leads to a warning: replacing (monkey-patching) a constant slot.
my_cnst: const = 2
```
A warning? What's the use of that? Are your constants *actually constant*, or are they just advisory? If they're just advisory, then we might as well stick with the convention of using UPPERCASE names, plus a linter that warns if you re-bind a "constant". If you can rebind constants by just suppressing or ignoring the warning, then people will rebind constants.

Relevant: sometimes I run GUI applications from the command line. Invariably they generate a flood of runtime warnings. Clearly developers are paying no attention to warnings.
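A short demonstration of why advisory warnings are toothless: Python's warning machinery lets anyone silence them wholesale (the warning text below is invented as a stand-in for what a strict-mode interpreter might emit):

```python
import warnings

def rebind_constant():
    # Stand-in for the strict-mode interpreter's advisory warning.
    warnings.warn("replacing constant slot 'my_cnst'", RuntimeWarning)

# Anyone inconvenienced by the warning can simply switch it off.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("ignore")
    rebind_constant()

assert len(caught) == 0  # the warning was filtered out entirely
```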
```
def fun():
    # Imports are not allowed at run-time.
    import mod2
    # But you can re-import a module previously imported at import-time.
    import mod
```
That seems like a pointless complication. If mod is already imported, why would I need to import it again from inside the function? Just to turn it into a local variable?

```
mylocal = mod
```

Forbidding any imports inside a function would be annoying and frustrating, but at least there would be no special cases and exceptions. Forbidding *some* imports, but allowing *unnecessary and redundant* imports inside a function, makes no sense to me.
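To illustrate the redundancy: once a module is in `sys.modules`, a function-level re-import is just a cached lookup plus a local binding, which plain assignment achieves equally well (using `math` here as a stand-in for `mod`):

```python
import sys
import math  # the "import-time" import

def with_reimport():
    import math  # re-import: a cheap lookup in sys.modules
    return math.sqrt(4)

def with_assignment():
    m = sys.modules["math"]  # the same local binding, no import statement
    return m.sqrt(4)

assert with_reimport() == 2.0
assert with_assignment() == 2.0
```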
```
    # RuntimeError
    my_cnst = 3
```
`my_cnst` is inside a function, so it should create a local variable, not attempt to rebind a global constant.
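That is how Python's scoping rules work today: an assignment inside a function body creates a fresh local, leaving any same-named global untouched.

```python
my_cnst = 1

def f():
    my_cnst = 3   # a new local binding; the global is not rebound
    return my_cnst

assert f() == 3
assert my_cnst == 1  # the global is unchanged
```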
```
    # RuntimeError
    mod.func2 = lambda x: 1
```
Yes, you've already made it clear that rebinding functions is not allowed.
```
    global glb1, new
    # RuntimeError: Cannot create new global name slots at runtime.
    new = 1
    # Nor can you delete existing ones.
    del glb1
```
I know "global variables considered harmful", but this looks to me like punishing users of globals for being bad, by restricting what they can do to make their use of globals *worse* rather than better:

- all globals must be pre-declared and initialised before use;
- functions cannot clean up after themselves by deleting their unneeded globals.

These two restrictions will give the coder annoyance and frustration. What advantage do they provide to make up for that?
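For example, here is the kind of clean-up idiom the second restriction outlaws (the names are invented for illustration):

```python
_cache = None

def warm_cache():
    global _cache
    _cache = {n: n * n for n in range(10)}

def drop_cache():
    # A function cleaning up after itself by deleting a global it no
    # longer needs -- forbidden under the proposed strict mode.
    global _cache
    del _cache

warm_cache()
squares_ok = _cache[3] == 9
drop_cache()
cache_deleted = "_cache" not in globals()
assert squares_ok and cache_deleted
```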
```
    # Cheats don't work.
    globals()["new"] = 1
```
That seems like it will probably break a lot of code, assuming you can even enforce it. Is your proposal for globals() to no longer return the global namespace dict?
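Plenty of working code relies on `globals()` returning the live module namespace dict. A hypothetical sketch of the pattern:

```python
def make_global(name, value):
    # globals() returns the real module namespace, so writing to it
    # creates a genuine global -- code like this exists in the wild.
    globals()[name] = value

make_global("answer", 42)
assert answer == 42  # the name was created dynamically above
```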
```
# Leads to a warning: replacing (monkey-patching) a constant slot
# (function).
def fun():
    pass
```
```
# fun_var is a variable storing a reference to a function (it can store
# a reference to another function).
fun_var = fun
```
So is `fun`. Are you aware that Python's execution model treats:

- function *objects* as first-class values, the same as ints, strings, floats, lists, etc.;
- and *names* bound to function objects ("functions") as no different from names bound to any other object?

You seem to be introducing a complicated execution model, where names bound to functions are different from other names, and functions are no longer first-class values, but either privileged or restricted, depending on whether you think "functions are values" is a misfeature to be removed or a feature to be cherished.
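A quick illustration of that execution model: function objects can be stored, passed, and rebound exactly like any other value.

```python
def double(x):
    return 2 * x

# Function objects are ordinary values: store them in containers,
# pass them around, and rebind names to them at will.
alias = double                               # just another name for the object
ops = {"double": double, "negate": lambda x: -x}

assert alias(21) == 42
assert [ops["negate"](n) for n in (1, 2)] == [-1, -2]
```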
```
# fun2 is an alias of fun.
fun2: const = fun
```
```
# Run-time execution starts with this function. This clearly delineates
# import-time from run-time: a module's top-level code is executed at
# import-time (including import statements, which execute the top-level
# code of other modules recursively). When that is complete, the strict
# mode interpreter switches to run-time mode (restrictions enabled) and
# executes __main__().
def __main__():
    fun()
```
```
# This statement is not executed when the program runs in strict mode.
# It is executed when the program runs in normal mode, and allows the
# same startup sequence (execution of __main__()) for both cases.
if __name__ == "__main__":
    __main__()
```
I don't really understand the purpose of your two modes here. In one mode, the interpreter automatically calls `__main__`; in the other mode, the interpreter runs the standard `if __name__ ...` idiom, which then calls `__main__`. How does the interpreter know which mode it is in? Presumably there must be a "strict" modal switch. If that's the case, why not use the presence of that switch to distinguish strict mode from Python mode, instead of this convoluted plan of sometimes automatically calling `__main__` and sometimes not?

This mandatory-but-only-sometimes special function seems pointless to me. Just because we're running under strict mode, why should the `if __name__` idiom stop working? It's just code. Your strict mode has to go out of its way to prevent it from working.

I'm sure you know that if this proposal goes ahead, people will immediately demand that `__main__` is automatically called in non-strict mode too. So I can't help but feel you are using this as a Trojan Horse to sneak in a stylistic change: replacing the `if __name__` idiom with an automatic call to a special, magic function. That seems to have zero technical benefit.

What if I want to call my application entry point `main()` or `Haupt()` or `entrypoint()`? What if I put the `if __name__` idiom *above* the special magic function? Does it still get magically ignored?

```
if __name__ == '__main__':
    print('Starting')

def __main__():  # magic
    print('Running')

__main__()

if __name__ == '__main__':
    print('Goodbye')
```

I *think* that under your proposal, under regular Python mode, it will print Starting, Running, Goodbye; but under your strict mode, it will print Starting, Running *twice*, and then exit. Maybe. It's not very clear. It could print Running only, and not print Starting or Goodbye at all.
Your proposed strict mode doesn't seem to actually be limited to making Python "stricter" for the purposes of optimization or clarity; it also includes changes which seem to be more *stylistic*, and which (probably) aren't necessary for the implementation:

* globals must be defined in the top level of the module;
* global constants override local variables;
* functions aren't variables like everything else;
* an enforced and automatic special entry-point function;
* discouraging the `if __name__` idiom by making it half redundant;

etc.

-- Steve