On Thu, Jan 21, 2021 at 10:09 PM Eric Traut <eric@traut.com> wrote:
I started to implement parts of the draft PEP 646 within pyright this evening. I figured this exercise might be helpful in informing our discussion and teasing out additional questions and issues. Here's what I've uncovered so far.

This is awesome work -- I always learn so much about a design by trying to implement it! Your experiences and observations are very useful. (I wonder if any of the PEP authors have worked on an implementation yet? Or anyone else?)
 
*Grammar Changes*
The grammar will need to change to support star expressions (i.e. unpack operator) within type annotations and in subscripts. If unpack operators are permitted within a subscript, how will that be handled at runtime? For example, in the expression `x[A, *B]`, what value will be passed to the `__getitem__` method for instance `x`? Will the runtime effectively replace `*B` with `typing.Unpack[B]`?

We could give `TypeVar()` an `__iter__()` method like this:
```python
def __iter__(self):
    yield f"*{self.__name__}"
```
That would make for a nice repr() of things like `tuple[*T]`. It wouldn't support runtime analysis -- we could support that by creating another helper object, but honestly I don't think that's going to be very useful.

This should be spelled out in the PEP though in case there are people who *do* plan on doing runtime analysis on types involving `*T`.
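For concreteness, here's a minimal runnable sketch of what that `__iter__` would do to the subscript value. `FakeTypeVar` is a hypothetical stand-in for the real `typing.TypeVar` (which doesn't have this method today); the parenthesized form `x[(A, *B)]` is used because that's what today's grammar accepts, and it's what the bare `x[A, *B]` would desugar to:

```python
class FakeTypeVar:
    # Hypothetical stand-in for TypeVar with the proposed __iter__.
    def __init__(self, name):
        self.__name__ = name

    def __repr__(self):
        return self.__name__

    def __iter__(self):
        # Unpacking the type variable yields a starred marker string.
        yield f"*{self.__name__}"


class Subscriptable:
    def __getitem__(self, item):
        # Echo back whatever the subscript machinery hands us.
        return item


A = FakeTypeVar("A")
B = FakeTypeVar("B")
x = Subscriptable()

# The starred element is expanded via B.__iter__() before the call,
# so __getitem__ receives the plain tuple (A, '*B').
print(x[(A, *B)])  # prints (A, '*B')
```

So `__getitem__` never sees anything special; the unpacking happens before the call, which is why this gives a nice repr but not much to hang runtime analysis on.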
 
Will star expressions be allowed in slice expressions? I presume no.

Nope.
 
*Zero-length Variadics*
It's legal to pass no arguments to a `*args` parameter. For example:
```python
def foo(*args: Any): ...
foo() # This is fine
```

*Unknown-length Variadics*
Am I correct in assuming that it's not OK to pass zero arguments to a `*args` parameter that has a variadic TypeVar annotation?
```python
def foo(*args: *T): ...
foo()  # I presume this is an error?
```

I had assumed this would be allowed -- the type would be that of the empty tuple (spelled `Tuple[()]`). I haven't looked but I could imagine that some of the many functions in some of the popular array/tensor packages might be inconvenienced if this were disallowed.

(PS: That's a separate piece of grammar that's not covered by PEP 637, we should call this out in PEP 646.)
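At run time nothing distinguishes this case anyway: a call with zero positional arguments simply binds `args` to the empty tuple, which is exactly what `Tuple[()]` denotes in the static world. A trivial sketch:

```python
def foo(*args):
    # With zero positional arguments, args is just the empty tuple ().
    return args

print(foo())        # prints ()
print(foo(1, "a"))  # prints (1, 'a')
```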
 
Also, it's generally OK to pass an arbitrary-length list of arguments to an `*args` parameter using an unpack operator.
```python
def foo(*args: Any): ...
def bar(x: Tuple[int, ...]):
   foo(*x) # This is allowed
```
I presume that it should be an error to pass an arbitrary-length list of arguments to an `*args` parameter if it has a variadic TypeVar annotation?
```python
def foo(*args: *T): ...
def bar(x: Tuple[int, ...], y: Iterable[int], z: Tuple[int]):
   foo(*x) # I presume this is an error?
   foo(*y) # I presume this is an error?
   foo(*z) # This is allowed
```

Could we make `foo(*x)` work? At least this could work:
```python
def foo(*args: *T) -> Tuple[*T]:
    return args
def bar(*x: int):
    y = foo(*x)
```
The type of y would be `Tuple[int, ...]`.
 
If my assumption is correct that this should be flagged as an error by a type checker, will it also be a runtime error? I'm guessing the answer is no, there's no way to distinguish such an error at runtime.

Which makes it more questionable to disallow it in the checker. (Of course there's plenty of working code that's disallowed by checkers, but this seems an odd edge case to differ on.)
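A small sketch of why the runtime can't tell these apart: by the time `foo` is entered, both calls have collapsed to an ordinary tuple of some concrete length, and the "fixed-length" vs "arbitrary-length" distinction exists only in the checker:

```python
def foo(*args):
    return args

fixed = (1, 2)         # statically known length: Tuple[int, int]
var = tuple(range(3))  # statically arbitrary length: Tuple[int, ...]

# Both unpack into identical-looking calls; only a static checker
# could flag the second as binding a variadic open-endedly.
print(foo(*fixed))  # prints (1, 2)
print(foo(*var))    # prints (0, 1, 2)
```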

If my assumption is incorrect and this is permitted, does the variadic type variable effectively become "open-ended" (i.e. the dimensionality of the variadic becomes unknown)? If so, how does it work with concatenation? I think it's better to make this an error.

It's easier for the checker implementation. :-) It doesn't strike me as theoretically unsound though. The length may be unknown at compile time, but it is not infinite, so things like `Tuple[int, *T, str]` are still well-defined at runtime.
 
*Other Observations*
This won't be an easy PEP to implement in type checkers, even with the simplifications I've recommended. It's going to be a heavy lift, which means it's going to be a long time before all type checkers support it. Compared to other recent type-related PEPs like 604, 612, 613, and 647, this PEP will require significantly more work to implement and get right. That could substantially delay the point at which it can be used in typeshed and other public stubs. This bolsters my conviction that we should embrace simplifications where possible.

I am sensitive to this observation. We could either plan follow-up PEPs with more advanced features, or we could define multiple phases of support in PEP 646 itself.
 
After this exercise, I'm even more convinced that we should support only unpacked usage ("*T") and not support packed ("T") for variadic type variables — and that we should use the existing TypeVar rather than introducing TypeVarTuple. In the rare cases where a packed version is desired, it can be expressed as `Tuple[*T]`. For example:

```python
T = TypeVar("T")
def func(*args: *T) -> Tuple[*T]: ...
```

By requiring `*T` everywhere for a variadic type variable, we will reduce confusion for users and simplify the spec and implementation.

But why couldn't `-> T` work for that example? Well, maybe you're right, if a user were to get confused and write `-> Tuple[T]` where they meant `-> Tuple[*T]` that would be quite the mess to sort out.

FWIW the thing that always makes my brain hurt is the `*args: *T` notation. I try to use the heuristic that for the case `*args: X` the type of `args` is `Tuple[X, ...]`. From that I can usually make the derivation that for `*args: *T` the type of `args` is `Tuple[*T, ...]`. And then I just have to realize that the `...` isn't needed here. (Which is easier if you pretend that `Tuple[str, int, ...]` means a tuple of a str followed by zero or more ints.)
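That heuristic is easy to check at runtime for the ordinary (non-variadic) case, which is plain PEP 484. A sketch:

```python
from typing import Tuple

def f(*args: int) -> Tuple[int, ...]:
    # Heuristic: with `*args: X`, args is received as a tuple, i.e. Tuple[X, ...].
    return args

print(f(1, 2, 3))  # prints (1, 2, 3)
```

The `*args: *T` case is then the same shape with the `...` dropped, because `*T` already stands for the whole sequence of element types.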

--
--Guido van Rossum (python.org/~guido)