Crazy idea: allow keywords as names in certain positions

As anyone still following the inline assignment discussion knows, a problem with designing new syntax is that it's hard to introduce new keywords into the language, since all the nice words seem to be used as method names in popular packages. (E.g. we can't use 'where' because there's numpy.where <https://docs.scipy.org/doc/numpy-1.14.0/reference/generated/numpy.where.html>, and we can't use 'given' because it's used in Hypothesis <http://hypothesis.readthedocs.io/en/latest/quickstart.html>.)

The idea I had (not for the first time :-) is that in many syntactic positions we could just treat keywords as names, and that would free up these keywords. For example, we could allow keywords after 'def' and after a period, and then the following would become legal:

    class C:
        def and(self, other):
            return ...

    a = C()
    b = C()
    print(a.and(b))

This does not create syntactic ambiguities because after 'def' and after a period the grammar *always* requires a NAME. There are other positions where we could perhaps allow this, e.g. in a decorator, immediately after '@' (the only keyword that's *syntactically* legal here is 'not', though I'm not sure it would ever be useful).

Of course this would still not help for names of functions that might be imported directly (do people write 'from numpy import where'?). And it would probably cause certain typos to be harder to diagnose.

I should also mention that this was inspired by some messages where Tim Peters berated the fashion of using "reserved words", waxing nostalgically about the old days of Fortran (sorry, FORTRAN), which doesn't (didn't?) have reserved words at all (nor significant whitespace, apart from the "start in column 7" rule).

Anyway, just throwing this out. Please tear it apart!

--
--Guido van Rossum (python.org/~guido)
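[Editorial sketch, not part of the original message: both positions mentioned above are hard SyntaxErrors in current CPython, which compile() makes easy to check.]

    # Both positions are SyntaxErrors today; the proposal would let the
    # NAME slot after 'def' and after '.' accept keyword spellings.
    for src in ("class C:\n    def and(self, other): return 0\n",
                "a.and\n"):
        try:
            compile(src, "<test>", "exec")
        except SyntaxError as e:
            print("SyntaxError:", e.msg)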

I like it! The obvious question, though: How would "*from package import keyword*" be handled, if not simply by SyntaxError? Would *from package import keyword as keyword_* be allowed? In a similar vein, what would happen with stdlib functions like operator.not_? The thought of writing "operator.not" is appealing, but being forced to use *that* (with *from operator import not* being non-allowable) may not be. On Sun, May 13, 2018, 11:20 AM Guido van Rossum <guido@python.org> wrote:

Elias Tarhini wrote:
Under the proposal I made in my last post, "from operator import not" would be fine -- you just wouldn't be able to use the "not" operator anywhere in the module then. :-)

A more nuanced version would have the effect restricted to the scope the import appears in, so that you could write

    def f():
        from operator import not
        # 'not' is now an ordinary name inside this
        # function, but it's business as usual elsewhere

Someone else can figure out how to make the parser handle this, though. :-)

--
Greg

On Sun, May 13, 2018 at 11:20 AM Guido van Rossum <guido@python.org> wrote:
For example, we could allow keywords after 'def' and after a period, and then the following would become legal:
Our modeling database overloads getattr/setattr (think SQLAlchemy) to allow us to access database fields as if they were Python data members. Nothing new here, but we do have problems with keyword collisions on some of the objects, as we are wrapping an already-existing modeling language (MSC Adams Solver dataset) with our objects. We were pleased when 'print' became a function, because it removed the restriction from that one, but one of the remaining ones is 'return', like this:

    class Sensor:
        def __init__(self):
            setattr(self, "print", 0)
            setattr(self, "return", 0)

    s = Sensor()
    s.print   # Works now, didn't in Python 2.
    s.return  # Bork.

I have a decades old +1 on this.
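[Editorial sketch, not from the thread: the workaround such wrappers are stuck with today, with the class repeated so the snippet stands alone -- reserved attribute names can only be reached through the string-based getattr/setattr spelling.]

    class Sensor:
        def __init__(self):
            setattr(self, "print", 0)
            setattr(self, "return", 0)

    s = Sensor()
    print(s.print)               # fine: 'print' stopped being a keyword in Python 3
    print(getattr(s, "return"))  # 'return' is still a keyword, so only the
                                 # string-based spelling works today
    setattr(s, "return", 42)     # likewise for assignment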

On Mon, May 14, 2018 at 4:19 AM, Guido van Rossum <guido@python.org> wrote:
I spent most of the 1990s coding in REXX, which has exactly zero reserved words. You can write code like this:

    if = 1
    then = "spam"
    else = "ham"
    if if then then else else

    do = 5
    do do
        print("Doobee doobee doo")
    end

The problem is that you can go a long way down the road of using a particular name, only to find that suddenly you can't use it in some particular context. Common words like "if" and "do" are basically never going to get reused (so there's no benefit over having actual keywords), but with less common words (which would include the proposed "where" for binding expressions), it's entirely possible to get badly bitten.

So the question is: Is it better to be able to use a keyword as an identifier for a while, and then run into trouble later, or would you prefer to be told straight away "no, sorry, pick a different name"?

ChrisA

Apologies for my initial response. Looks like I failed to expand the initial email fully, which would have shown me the following :)
Of course this would still not help for names of functions that might be imported directly (do people write 'from numpy import where'?).
-- I do think the *import keyword as keyword_* concept has merit, however. (This would also be a simple retroactive solution to the asyncio.async problem, wouldn't it?) On Sun, May 13, 2018, 11:45 AM Chris Angelico <rosuav@gmail.com> wrote:

On 2018-05-13 11:19, Guido van Rossum wrote:
Spooky! :-) People definitely do write "from numpy import where". You can see examples just by googling for that string. And that's only finding the cases where "where" is the first thing in the from clause; there are probably many more where it's something like "from numpy import array, sin, cos, where".

I think this kind of special casing would be really confusing though. It would mean that these two would work (and do the same thing):

    import np
    np.array(...)

    from np import array
    array(...)

And this would work:

    import np
    np.where(...)

But then this would fail:

    from np import where
    where(...)

That would be really puzzling. Plus, we'd still have the problem of backwards compatibility. Before the new "where" keyword was introduced, the last example WOULD work, but then afterwards you'd have to change it to look like the third example. This might induce wary programmers to hide all names behind a period (that is, find a way to do "foo.bar" instead of just "bar" whenever possible) in order to guard against future keywordization of those names.

The other thing is that I suspect use of such a privilege would explode due to the attractiveness of the reserved words. That is, many libraries would start defining things named "and", "or", "with", "is", "in", etc., because the names are so nice and short and are useful in so many situations. So there'd be nowhere to hide from the deluge of names shadowing keywords.

--
Brendan Barnwell
"Do not follow where the path may lead. Go, instead, where there is no path, and leave a trail."
   --author unknown

[Guido]
The Fortran standards continued evolving, and still are, but despite that, in many ways it's become a much more "modern" language, and while it continues to add ever more keywords, it _still_ has no reserved words. They have a strong reason for that: there are numeric libraries coded decades ago in Fortran still in use, which nobody understands anymore, so anything that breaks old code is fiercely opposed. If you think people whine today about an incompatible Python change, wait another 30+ years to get a sense of what it's like in the Fortran community ;-)

I'm not sure people realize that Algol had no reserved words either. But that was "a trick" :-) Algol was originally designed as a formal notation for publishing algorithms, not really for executing them. Published Algol "code" always showed the language's keywords in boldface.

Similarly, the nearly incomprehensible (to me) Algol 68's 60 "reserved words" are essentially defined to be in boldface in the "representation" (for publication) language, but can also be used as identifiers etc. The intimately related but distinct "reference language" needs to be used for computer input. In that, the reserved words are decorated in some way, to distinguish them from user-defined names of the same spelling. There's more than one way to do that, and - indeed - it's required that Algol 68 compilers support several specific ways. Anyone looking for more pain can get it here:

https://en.wikipedia.org/wiki/ALGOL_68#Program_representation

Note that Perl largely (but not entirely) managed to dodge the problem by requiring a punctuation character (like $ for scalar and @ for array) before variable names. In addition to REXX (already mentioned), Prolog and PL/I had no reserved words.

End of brain dump ;-) But I didn't suggest that for Python. It's a bit too late for that anyway ;-) It was more in line with what you suggested here: think more about ways in which reserved words _could_ be allowed as identifiers in specific contexts where implementing that doesn't require heroic effort.

On 13/05/2018 19:19, Guido van Rossum wrote:
This would not prevent code breakage when a new keyword was added. It would only reduce the amount of code broken. Although in my unsubstantiated opinion not by very much; I suspect that most of the time an identifier is used in a module, it is used at least once in contexts where it would still be a SyntaxError if it were a keyword. Rob Cliffe

Guido van Rossum wrote:
Of course this would still not help for names of functions that might be imported directly (do people write 'from numpy import where'?).
Maybe things could be rigged so that if you use a reserved word as a name in an import statement, it's treated as a name everywhere else in that module. Then "from numpy import where" would Just Work. -- Greg

On Sun, May 13, 2018 at 9:00 PM, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
'from numpy import *' is also common. -n -- Nathaniel J. Smith -- https://vorpus.org

As long as any new syntax allowed us to still reference the occasional thing in the older libraries that used newly reserved names, it would not come up that often, and would avoid the biggest cost of a whole new version.

If `foo` was a reserved word, then this could be allowed...

    import foo as bar

...but not these...

    import foo
    import bar as foo

The same could be done with params, so this would be illegal...

    def f(foo): ...

...but this would be fine...

    f(foo=1)

It would be illegal to define a property named `foo`, but you could still do `bar.foo` to use a library, etc.

It could be done, but it's not especially relevant here, so I'll shut up now.

-- Carl Smith
carl.input@gmail.com

On 14 May 2018 at 03:47, Rob Cliffe via Python-ideas <python-ideas@python.org> wrote:

On 13 May 2018 at 14:19, Guido van Rossum <guido@python.org> wrote:
While I think the "restricted use" idea would be confusing, I do like the idea of separating out "postfix keywords", which can't start a statement or expression, and hence can be used *unambiguously* as names everywhere that names are allowed. Adding such a capability is essential to proposing a keyword based approach to inline assignments, and would technically also allow "and", "or", "is", and "as" to be freed up for use as names. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

On 13/05/18 19:19, Guido van Rossum wrote:
I'm not familiar with the innards of the parser and it's waaaay too long since I sat through a parsing course, but can you turn that inside out? Are there times when the compiler knows it must be looking at a keyword, not a name? I suspect not, given that arbitrary expressions can be statements, but someone else may have a more knowledgeable opinion. -- Rhodri James *-* Kynesim Ltd

On 2018-05-14 12:35, Rhodri James wrote:
Sure are. For example, just after `1 `, a name would be an error, while a keyword would be fine. But, more to the point, there are times when the parser knows (or could know) it can't be looking at a certain keyword. Suppose you want to parse at the beginning of an expression when you see "and". Currently, you're duty-bound to explode, because "and" cannot begin an expression. You *could* equally well not explode and, knowing that "and" cannot begin an expression, interpret it completely unambiguously as a name.
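[Editorial sketch, not from the thread, checkable in current CPython: the tokenizer already emits keywords as plain NAME tokens, and it is the parser that later rejects them in name positions, which is what makes a context-sensitive rule like the one described above conceivable.]

    import io
    import keyword
    import tokenize

    # The tokenizer happily tokenizes 'and' after a dot; only the parser
    # would object.  Keyword-ness is decided per token, after the fact.
    src = "a and b.and"
    for tok in tokenize.generate_tokens(io.StringIO(src).readline):
        if tok.type == tokenize.NAME:
            kind = "keyword" if keyword.iskeyword(tok.string) else "name"
            print(tok.string, "->", kind)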

On 5/13/2018 2:19 PM, Guido van Rossum wrote:
This trades the simplicity of 'using a keyword as an identifier always fails immediately' for the flexibility of 'using a keyword as an identifier may work, or may fail later'. The failure would be predictable in the sense of being deterministic for any particular code string, but unpredictable in the sense of not knowing what code would be written. It would also be unpredictable when people use a keyword as a variable without knowing that they had done so.

The proposal reminds me of the current tkinter situation when using tcl/tk compiled without thread support: calling a tkinter function (method) from worker threads may work, at least initially, but may eventually fail. The main problem here is that this is not documented. But I suspect that if and when it is (https://bugs.python.org/issue33479), most will 'not do that' and only a few intrepid souls will do the necessary experiments.

The impact on real-time syntax coloring should be considered. An re-based colorizer, like IDLE's, tags every appearance of keywords outside of strings and comments, syntactically correct or not. For instance, the first two 'and's in "a and b.and # and error" are tagged. To not tag the second would require full parsing, presumably by compiling to AST, and more support code. I suspect this would make coloring slower. Whether too slow for existing machines, I don't know.

--
Terry Jan Reedy

[Terry Reedy <tjreedy@udel.edu>]
I can pretty much guarantee it would be slower - way slower. But regexps could still do the bulk of it, provided the contexts remain as simple-minded as the specific ones Guido suggested:

    For example, we could allow keywords after 'def' and after a period,

So when the IDLE colorizer regexp finds a match with the "KEYWORD" group name, don't color it at once. Instead look back from the start of the _possible_ keyword match, to see whether it's preceded by

    period maybe_whitespace

or

    "def"-starting-at-a-word-boundary at_least_one_whitespace

and if so skip coloring.

There are many ways to code that, but - ya - they're annoying. Alas, the re module's negative lookbehind assertion isn't powerful enough for it, else those checks could be added directly to the keyword part of the colorizing regexp. I expect (but don't know) that lookbehinds in MRAB's "regex" module are strong enough for it.

Of course that's just IDLE. It was remarkably easy to teach my primary editor (Source Insight) all sorts of stuff about .py files, but its "keywords" subsystem is restricted to specifying literal strings to look for. The kinds of tricks above probably require that the tool have access to a real programming language (like IDLE and Emacs enjoy).
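[Editorial sketch of the manual look-back described above, not from the thread: the regex still finds keyword candidates, and a plain-Python check on the preceding text decides whether to skip coloring. String/comment handling and most keywords are omitted to keep it short.]

    import re

    KEYWORDS = r"\b(?:and|or|not|if|else|def|return)\b"  # abridged list

    def keyword_spans(code):
        """Yield (start, end) spans of keyword matches that should be
        colored, skipping those preceded by '.' or by 'def'."""
        for m in re.finditer(KEYWORDS, code):
            before = code[:m.start()].rstrip()
            if before.endswith("."):
                continue          # attribute position: treat as a name
            if re.search(r"\bdef$", before):
                continue          # name being defined: treat as a name
            yield m.span()

    code = "a and b.and"
    print([code[s:e] for s, e in keyword_spans(code)])   # -> ['and']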

Terry Reedy wrote:
the first two 'and's in "a and b.and # and error" are tagged. To not tag the second would require full parsing,
That particular case could be handled by just not colouring any word following a dot. Some other situations might result in false positives, but there are cases like that already. Many Python colourisers colour any occurrence of a builtin function name, even if it has been shadowed, and people seem to live with that. It might even be useful, if it alerts people that a name they're using is reserved in some contexts, and so might be best avoided. -- Greg

On 5/14/2018 6:58 PM, Greg Ewing wrote:
OK, more parsing, even if not complete. Other cases require looking at several other things before or after, depending on the exact proposal.

Okay, more parsing, depending on the exact rules. Some of the variations (
For IDLE:

    def int(): return 0   # int colored as defined name
    int()                 # int colored as builtin
    0

Perhaps I should add a check whether defined names are builtins, and if so, color them as builtins even after 'def' and 'class'.
Terry Jan Reedy
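[Editorial sketch of that check; the helper name is hypothetical, and IDLE's real colorizer works in terms of text-widget tags rather than returned strings.]

    import builtins

    def tag_for_def_name(name):
        """Hypothetical helper: choose a highlight tag for the name that
        follows 'def' or 'class', per the suggestion above."""
        return "BUILTIN" if hasattr(builtins, name) else "DEFINITION"

    print(tag_for_def_name("int"))    # BUILTIN
    print(tag_for_def_name("spam"))   # DEFINITION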

I'm not half as aware of the way everything works under the hood as the vast majority here, but I have an idea. It possibly is technically stupid when you really know how things work, but as I'm not sure, I'm submitting it anyway:

Why not handle the compatibility at setup.py's level? Like having something like this:

    setup(
        name="LegacyLib",
        version="8.4.1",
        ...
        max_compatibility="3.4"  # Defining here Python's max version for
                                 # which this lib has been upgraded
    )

Of course, we may use any other word instead of "max_compatibility", like "designed_for", "python_version", or anything a better English speaker could think of. The point is, it would either:

- when you install the library, rename all variables that are now keywords (we'd know the exact list thanks to max_compatibility) by suffixing them with "_"
- or set a flag that will do that when creating the *.pyc files.

Possible problems/limitations I can already find:

- There would still be possible errors when using variable names that are generated on the fly (I have no clue how this could ever be addressed)
- It might get complicated at some point to know what to do. For example, when we have lib_a in some version depending on lib_b (with or without a max_compatibility version number), it is obvious that lib_a will use lib_b's original variable names (without the appended "_"), but our code, which might also want to interact with lib_b, would have to use the suffixed names.

Is it plain stupid? Are there lots of things I didn't think of? Could it be a possibility?

-Brice

On 13/05/2018 at 20:19, Guido van Rossum wrote:

participants (17)
- Brendan Barnwell
- Brice Parent
- Carl Smith
- Chris Angelico
- Ed Kellett
- Elias Tarhini
- Eric Fahlgren
- Greg Ewing
- Guido van Rossum
- Nathan Schneider
- Nathaniel Smith
- Nick Coghlan
- Rhodri James
- Rob Cliffe
- Stephan Houben
- Terry Reedy
- Tim Peters