On 1/12/21 11:27 PM, Chris Angelico wrote:
On Wed, Jan 13, 2021 at 6:11 PM Ethan Furman wrote:
Optimizations are an implementation detail, and implementation details should not change the language.
The language can also be defined in an optimization-friendly way, though. Consider how we got positional-only arguments in Python: first they existed in C-implemented functions in CPython, even though they couldn't exist in pure Python code, and then the functionality got added to the language definition, thus permitting the optimization.
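For reference, the positional-only syntax that eventually made it into the language definition is PEP 570's `/` marker (Python 3.8+), which gives pure-Python functions the same calling restriction C-implemented functions always had. A minimal sketch:

```python
# PEP 570: parameters before the "/" are positional-only,
# matching the restriction long present in C-implemented builtins.
def divide(a, b, /):
    return a / b

# Positional call works as usual.
result = divide(10, 2)

# Passing the same arguments by keyword is a TypeError,
# because a and b are not part of the function's keyword interface.
try:
    divide(a=10, b=2)
    by_keyword_ok = True
except TypeError:
    by_keyword_ok = False
```

Because `a` and `b` can never be passed by keyword, an implementation is free to skip keyword-matching machinery for such functions entirely, which is the optimization-friendly definition Chris is describing.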
1. What optimization? 2. Did the language change because of the optimization?
Or consider dictionary lookup. Most people treat it as "find a key which is equal to the one you're looking for", but the actual definition is "find a key which is identical to, or equal to, the one you're looking for".
Exactly. The definition, i.e. the language spec, says identity first, then equality.
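The identity-then-equality rule is observable with a NaN key, since NaN is never equal to anything, including itself; a lookup with the very same object still succeeds because the identity check matches first. A small demonstration:

```python
nan = float("nan")
d = {nan: "found"}

# nan != nan, so an equality-only lookup could never match.
# The lookup succeeds because the key object is *identical*.
same_object = d[nan]

# A different NaN object is neither identical nor equal: KeyError.
try:
    d[float("nan")]
    other_found = True
except KeyError:
    other_found = False
```

This is also why the shortcut is safe: `x is y` implies `x == y` for sanely-behaved types, so checking identity first only changes the outcome for pathological equality like NaN's.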
The topic under discussion is a language definition. Choosing to permit the optimization doesn't mean that the implementation detail changes the language. Choosing to deny it means there won't be an optimization.
There are, I am sure, many optimizations that are not possible because of Python's dynamism. `if <something>` is supposed to evaluate `bool(something)`, regardless of what comes after.
I personally don't see any reason to force Python to calculate something unnecessarily, given that this is *already* happening in other situations (see the "if a and b:" optimization, which doesn't boolify twice).
Sure, and I agree that calling `bool()` a second time is wasteful, as well as possibly confusing -- Python already has the answer from the first `bool()` call, so why would it need to do it again? That seems a matter of correctness, not optimization. -- ~Ethan~
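The `if a and b:` behavior discussed above can be made visible with a class that counts its `__bool__` calls (a hypothetical `Tracker` class for illustration). A naive desugaring, `tmp = a and b` followed by `if tmp:`, would test `bool(a)` twice when `a` is falsy, once for the `and` and once for the `if`; CPython instead compiles the condition as a chain of jumps so each operand is tested at most once:

```python
class Tracker:
    """Records how many times bool() is evaluated on the instance."""
    def __init__(self, value):
        self.value = value
        self.calls = 0

    def __bool__(self):
        self.calls += 1
        return self.value

a = Tracker(False)
if a and Tracker(True):
    pass

# The compiler jumps straight past the body after the single falsy
# test of a; a naive "tmp = a and b; if tmp:" would have tested it twice.
count = a.calls
```

With the jump-based compilation, `a.calls` is 1, which is exactly the single-`bool()` behavior Ethan describes as a matter of correctness rather than optimization.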