[Python-Dev] Please reject or postpone PEP 526

Koos Zevenhoven k7hoven at gmail.com
Mon Sep 5 09:59:01 EDT 2016


Sorry, I don't have time to read emails of this length right now, and
perhaps I'm interpreting your emails more literally than you intend
them, anyway.

If PEP 484 introduces unnecessary restrictions at this point, that's a
separate issue. I see no need to copy those into PEP 526. I'll be
posting my own remaining concerns regarding PEP 526 when I find the
time.

-- Koos


On Mon, Sep 5, 2016 at 4:46 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On 5 September 2016 at 21:46, Koos Zevenhoven <k7hoven at gmail.com> wrote:
>> The thing I'm promoting here is to not add anything to PEP 526 that
>> says what a type checker is supposed to do with type annotations.
>
> PEP 526 says it doesn't intend to expand the scope of typechecking
> semantics beyond what PEP 484 already supports. For that to be true,
> it needs to be able to define expected equivalencies between the
> existing semantics of PEP 484 and the new syntax in PEP 526.
>
> If those equivalencies can't be defined, then Mark's concerns are
> valid, and the PEP either needs to be deferred as inadvertently
> introducing new semantics while intending to only introduce new
> syntax, or else the intended semantics need to be spelled out as they
> were in PEP 484 so folks can judge the proposal accurately, rather
> than attempting to judge it based on an invalid premise.
>
> For initialised variables, the equivalence between the two PEPs is
> straightforward: "x: T = expr" is equivalent to "x = expr  # type: T".
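>
> A minimal sketch of that equivalence, using a concrete type for
> illustration (any PEP 484 type behaves the same way):
>
>     from typing import List
>
>     # PEP 526 annotated assignment:
>     x: List[int] = []
>     # PEP 484 type comment, carrying the same meaning:
>     y = []  # type: List[int]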
>
> If PEP 526 always required an initialiser, and didn't introduce
> ClassVar, there'd be no controversy, and we'd already be done.
>
> However, the question of "Does this new syntax necessarily imply the
> introduction of new semantics?" gets a lot murkier for uninitialised
> variables.
>
> A strict "no new semantics beyond PEP 484" interpretation would mean
> that these need to be interpreted the same way as parameter
> annotations: as a type hint on the outcome of the code executed up to
> that point, rather than as a type constraint on assignment statements
> in the code *following* that point.
>
> Consider:
>
>     from typing import List, TypeVar
>     T = TypeVar("T")
>     def simple_appender(base: List[T], value: T) -> None:
>         base.append(value)
>
> This will typecheck fine - lists have append methods, and the value
> appended conforms to what our list expects.
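>
> For example, this call is fine (assuming the imports shown above):
>
>     simple_appender([1, 2, 3], 4)  # OK: a List[int] and an int value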
>
> The parameter annotations mainly act as constraints on how this
> function is *called*, with the following all being problematic:
>
>     simple_appender([1, 2, 3], "hello") # Container/value type mismatch
>     simple_appender([1, 2, 3], None) # Value is not optional
>     simple_appender((1, 2, 3), 4) # A tuple is not a list
>
> However, because of the way name binding in Python works, the
> annotations in *no way* constrain assignments inside the function
> body:
>
>     def not_so_simple_appender(base: List[T], value: T) -> None:
>         other_ref = base        # inferred as List[T], from base
>         base = value            # rebinds "base"; its type is now T
>         other_ref.append(base)  # still fine: other_ref is List[T]
>
> From a dynamic typechecking perspective, that's just as valid as the
> original implementation, since the "List[T]" type of "other_ref" is
> inferred from the original type of "base" before it gets rebound to
> value and has its new type inferred as "T".
>
> This freedom to rebind an annotated name without a typechecker
> complaining is what Mark is referring to when he says that PEP 484
> attaches annotations to expressions rather than to variables.
>
> Under such "parameter-annotation-like" semantics, uninitialised
> variable annotations would only make sense as a new form of
> post-initialisation assertion, and perhaps as some form of
> Eiffel-style class invariant documentation syntax.
>
> The usage to help ensure code correctness in multi-branch
> initialisation cases would then look something like this:
>
>     if case1:
>         x = ...
>     elif case2:
>         x = ...
>     else:
>         x = ...
>     # If we get to here without x being List[T], something's wrong
>     assert x : List[T]
>
> The interpreter could then optimise type assertions out entirely at
> function level (even in __debug__ mode), and turn them into
> annotations at module and class level (with typecheckers then deciding
> how to process them).
>
> That's not what the PEP proposes for uninitialised variables though:
> it proposes processing them *before* a series of assignment
> statements, which *only makes sense* if you plan to use them to
> constrain those assignments in some way.
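>
> Concretely, that means the proposed spelling puts a bare annotation
> ahead of the branches, along the lines of this sketch:
>
>     x: List[T]  # uninitialised: declared before any assignment
>     if case1:
>         x = ...
>     elif case2:
>         x = ...
>     else:
>         x = ...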
>
> If you wanted to write something like that under a type assertion
> spelling, then you could enlist the aid of the "all" builtin:
>
>     # All local assignments to "x" must abide by this constraint
>     assert all(x) : List[T]
>     if case1:
>         x = ...
>     elif case2:
>         x = ...
>     else:
>         x = ...
>
> So I've come around to the point of view of being a solid -1 on the
> PEP as written. Despite the best of intentions, it strongly
> encourages "assert all(x) : List[T]" as the default interpretation
> of uninitialised variable annotations, and it doesn't provide an
> easy way to do arbitrary inline type assertions to statically check
> the correctness of the preceding code, the way we can with runtime
> assertions, and the way that would happen if the code in question
> were factored out into an annotated function.
>
> Stick the "assert" keyword in front of them though, call them type
> assertions rather than type declarations, and require all() when you
> want to constrain all assignments later in the function (or until the
> next relevant type assertion), and I'm a solid +1.
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +

