
Andrew, Eduard, if you're going to suggest changes to Python, you really ought to
first learn the current version, 3.4, and use it in your examples, rather than 2.7. (And use PEP8 style, too.)
In simple cases like this, it doesn't make any real difference, and people can understand what you're suggesting--but it may still make people subconsciously less receptive (as in, "if he's not even willing to keep up with Python and see what we've improved in the last decade, what are the chances he'll have any good insight that's worth paying attention to that we haven't already thought of?").
I agree with you. That was my fault.

2015-02-11 4:30 GMT+02:00 Stephen J. Turnbull <stephen@xemacs.org>:
Eduard Bondarenko writes:
So I can type whatever I want in an else block and someone may only detect it after several years...
That's a shame. However, typos like that are not the only way for severe errors to lurk in rarely used branches. As long as your static tool can't catch *all* bugs, there's a cost/benefit question: are (1) the programmer effort to add and maintain the functionality (including updating it for new features, especially keywords and formerly illegal syntax) and (2) the additional time to run it justified by the likely benefit in reduced bugginess?
The cost/benefit issue is of course always present, but it becomes discouraging here because the only way to detect non-syntax bugs before runtime is careful code review, and careful review is likely to catch most syntax bugs anyway -- so the benefit of additional compile-time checking is small.
Note that I'm not sure that the Python compiler currently knows enough to provide the warnings you want. Consider:
def doit():
    print(a)

a = "It works!"
doit()
Because Python is a dynamic language, the names 'a' and 'doit' (and 'print', for that matter) must be looked up at runtime, returning objects. This means that compiling that program does not require that the compiler know whether 'a', 'doit', and 'print' are defined! So adding those warnings might require major changes to the compiler.
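A quick way to see that for yourself (this is just standard Python 3 behaviour, not anything new): drop the assignment and the def statement still compiles and runs silently; the failure only appears when the body is actually executed:

def doit():
    print(a)

doit()    # NameError: name 'a' is not defined -- raised at call time, not at compile time

The compiler happily produced bytecode for doit() without ever asking whether 'a' exists.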
And the first thing that I have to do when programming in Python is to download an analyser tool...
The SEI would say "No no no! The first thing you need to do is *review your code*!" IDE developers would say "Why aren't you using identifier completion?", etc., etc. This "need" for warnings from the Python interpreter is very much a personal preference on your part. There's nothing wrong with having such preferences, and nothing wrong with expressing them here. You might come up with something with wide applicability to new users or less frequent users that the heavy users who make up the bulk of developers would miss.
However, I think that for both typical new users (at least those whose job title is marketer or sociologist or lawyer) and less frequent users the non-syntax G'IGO problem (where G' = good and G = garbage) is much larger than the occasional uncaught NameError. Such users don't branch as heavily as library module authors, typically branch only for easily conceptualized cases or cases observed early in development, and so often are quite well-served by "all errors are fatal at runtime". And of course this is all personal opinion, too, but I suspect similar ones are shared by many core Python developers (and I'm not one of those! so nothing I say is authoritative <wink />).
I understand the importance of static analyser tools, but in this case (to check simple things like variable and function names) it is like using a sledgehammer to crack a nut.
Again, this is a personal preference (and again, there's nothing wrong with having one or expressing it). The CPython community has generally agreed that the virtual machine (including the compiler and interpreter) should be kept as simple and clean of extra features as possible. A change to the compiler merely to warn about suspicious constructs is considered the "sledgehammer" here. On the other hand, because the constructs used by the compiler are exposed as ordinary classes, writing separate tools is fairly easy.
Also, I don't really understand your sledgehammer analogy. It's true that catching undefined names could be done separately, but running a static analysis tool will catch many other kinds of bugs that any given warning filter would not. There are many simple (one-pass, line-by-line) "linters" that only look within a single logical line for suspicious constructs, and others that will check things like assignment before use. You can pick the level of analysis you like.
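As a rough illustration of how approachable this is (a minimal sketch only, not how pyflakes or any real linter works; it ignores scoping rules, imports, and execution order entirely), the standard ast module already exposes enough structure to flag names that are loaded but never bound anywhere in a module:

import ast
import builtins

source = '''
def doit():
    print(a)

a = "It works!"
doit()
'''

tree = ast.parse(source)

bound = set(dir(builtins))        # treat built-in names as already defined
loads = []

for node in ast.walk(tree):
    if isinstance(node, (ast.FunctionDef, ast.ClassDef)):
        bound.add(node.name)      # def and class statements bind their name
    elif isinstance(node, ast.arg):
        bound.add(node.arg)       # function parameters are bindings too
    elif isinstance(node, ast.Name):
        if isinstance(node.ctx, ast.Store):
            bound.add(node.id)    # assignment targets
        else:
            loads.append(node)    # every other use is a lookup

for node in loads:
    if node.id not in bound:
        print("line %d: name %r is never bound" % (node.lineno, node.id))

Nothing here needs to live inside the compiler; it is an ordinary script you can run (or extend) before shipping code, which is exactly the division of labour the checkers on PyPI rely on.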
PS. Within C++
           ^
*** Warning *** Every line of this program contains obscure and DANGEROUS constructs!

<wink />

or other languages I should use static analyser tools to check real bug-prone situations, but in new, modern Python I should use third-party tools to check for really simple errata.
I suppose from your mention of "third-party" tools in the case of C++ you mean "vendor-provided" static analyzer tools, though you didn't write that. The answer to the implied question is "yes and no". C++ compiler optimizations are notoriously buggy and backward-incompatible, and implemented differently by each vendor. And there's a tradition (maintained for "political" reasons in the case of GCC, leading to RMS's resistance to a plugin architecture and exporting the AST) of packing all functionality into a single command, even a single program. Vendor-specific, vendor-implemented checkers make a heck of a lot of sense. By contrast with GCC, LLVM's C and C++ tools look a lot like Python, with competing implementations of quite a few modules in and out of the LLVM project proper.
Now, in Python, as I mentioned, it's relatively easy to write checkers, and there are many of them. There is some core agreement on the required features, but the degree of agreement is considered insufficient to justify adding a module to the standard library, especially as many of the various checkers are in active development and still gaining new features. This isn't a blanket opposition; years ago support for "function annotations" was added, and just recently a specific syntax for type hinting has (pretty much) been decided, and the PEP will probably be approved shortly, addressing both improved diagnostics and optimization needs.
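For readers who haven't met them, function annotations are just expressions attached to parameters and return values. The interpreter stores them (in __annotations__) but attaches no meaning to them; it is external checkers that the type hinting PEP is aimed at. The syntax already works in any Python 3, and the names below are purely illustrative:

def greet(name: str, times: int = 1) -> str:
    return ("Hello, " + name + "! ") * times

A checker that understands the annotations can warn about a call like greet(42) before the program ever runs, while the interpreter itself keeps doing exactly what it does today.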
Finally (and I think nobody has mentioned this yet), downloading third-party tools has always been fairly easy in Python due to the package repository PyPI, and continuous effort has been expended over the years on making it easier. In fact, for many practical purposes the "batteries included" philosophy has become impractical to support. So before you can begin to write a program, you need to download a module (and its documentation) to support functionality needed by your program. This is true for almost everybody nowadays. In response, the download and installation tool "pip" was recently added to the core distribution of both Python 2 and Python 3. That kind of generally useful effort is valued more in Python than simple addition of features that most likely would not be used by most Python programmers, not even implicitly in the standard library. And the community deliberately aims at such improvements.
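Concretely, pulling in one of the checkers discussed above is a one-liner once pip is present (pyflakes is just one example; flake8, pylint and friends install the same way):

    python -m pip install pyflakes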
Your mileage may vary, of course, but having observed the community for over a decade now I believe the above to be a fairly accurate expression of some of the core values of this community.
Regards,
Steve