# Unbounded Tuples
I've added support for accepting `Ts` as an unbounded tuple `Tuple[Any, ...]` or `Tuple[int, ...]`. The type inference rules are similar to those in TypeScript (https://github.com/microsoft/TypeScript/pull/39094).
This meant we could do:
```python
from typing import Tuple, TypeVar, TypeVarTuple

T = TypeVar("T")
Ts = TypeVarTuple("Ts")

def foo(xs: Tuple[*Ts]) -> Tuple[*Ts]: ...

def baz() -> None:
    unbounded_tuple: Tuple[int, ...]
    z = foo(unbounded_tuple)
    # => Tuple[int, ...]
    reveal_type(z)

def foo2(xs: Tuple[T, *Tuple[str, ...]]) -> T: ...

def baz2() -> None:
    some_tuple: Tuple[int, str, str]
    z = foo2(some_tuple)
    # => int
    reveal_type(z)
```
I'm ambivalent about allowing such explicit unpacking of `Tuple[int, ...]`. Given that we need it for arbitrary-rank `Tensor` anyway, it seems cleaner to allow it, but it may be confusing for users.
Another thing we may want to consider in a future PEP: setting a bound for `Ts`. For example, we may want `Tensor` parameters to be bounded by `int`, i.e., allow `Tensor[Literal[480], Literal[360]]` but not `Tensor[str, str]`. This could be done by essentially setting `TypeVarTuple("Ts", bound=Tuple[int, ...])`. This might be useful for future type arithmetic, since we'd need to be able to safely say something like `def flatten(x: Tensor[*Ts]) -> Tensor[Product[Ts]]: ...`.
# Arbitrary-rank Tensors
For gradual typing, we'd need to allow `x: Tensor`. Some library functions may be using `Tensor` without parameters until they are migrated to variadics. Calling them should not raise errors.
So, I treated `Tensor` without parameters as `Tensor[*Tuple[Any, ...]]`. (As Guido pointed out, `Tensor[Any, ...]` is not valid syntax.)
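For concreteness, the examples below assume a variadic `Tensor` class roughly like the following sketch (the typing_extensions spelling and the class itself are assumptions for illustration, not the actual library definition):

```python
from typing import Generic
from typing_extensions import TypeVarTuple, Unpack

Ts = TypeVarTuple("Ts")

class Tensor(Generic[Unpack[Ts]]):
    """Placeholder variadic Tensor: each type parameter is one dimension."""
```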
Gradual typing has two main requirements:
(a) `Tensor[int, str]` should be compatible with `Tensor`
```python
def expects_arbitrary_tensor(x: Tensor) -> Tensor: ...

def bar() -> None:
    tensor: Tensor[int, str]
    y = expects_arbitrary_tensor(tensor)
    reveal_type(y)
```
(b) `Tensor` should be compatible with a concrete `Tensor[int, str]`
```python
def expects_concrete_tensor(x: Tensor[int, str]) -> Tensor[int, str]: ...

def bar() -> None:
    tensor: Tensor
    expects_concrete_tensor(tensor)
```
(This is analogous to `List[Any]` being compatible with `List[int]` and vice versa.)
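For comparison, here is a minimal illustration of the `List` behavior that type checkers already accept in both directions (helper names are made up for the example):

```python
from typing import Any, List

def expects_concrete_list(x: List[int]) -> None: ...
def expects_any_list(x: List[Any]) -> None: ...

def demo() -> None:
    any_list: List[Any] = []
    int_list: List[int] = []
    expects_concrete_list(any_list)  # OK: List[Any] is compatible with List[int]
    expects_any_list(int_list)       # OK: List[int] is compatible with List[Any]
```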
By default, both calls raised an error because `Tensor` is invariant: we had to check that its parameters were compatible in both directions, i.e., (a) `[int, str]` is compatible with `[*Tuple[Any, ...]]` and (b) `[*Tuple[Any, ...]]` is compatible with `[int, str]`.
To be explicit, (b) is equivalent to checking that `Tuple[Any, ...]` is compatible with `Tuple[int, str]`. That is a problem because we don't generally consider `Tuple[Any, ...]` to be compatible with `Tuple[int, str]`. For example, Mypy raises an error:
```python
from typing import Any, Tuple

def expects_concrete_tuple(x: Tuple[int, str]) -> None: ...

def bar() -> None:
    unbounded_tuple: Tuple[Any, ...]
    # main.py:9: error: Argument 1 to "expects_concrete_tuple" has incompatible type "Tuple[Any, ...]"; expected "Tuple[int, str]"
    y = expects_concrete_tuple(unbounded_tuple)
    reveal_type(y)
```
To work around this, we could either
(i) allow `Tuple[Any, ...]` in general to be compatible with `Tuple[int, str]`, or
(ii) special-case variadic classes like `Tensor` so that `Tensor` is compatible with `Tensor[int, str]` and vice versa.
Both are unsound: the tuple or tensor we pass in may have zero elements, or its elements may be of types that can't be used as an `int` or `str`; either way, we get a runtime error.
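To make the unsoundness concrete, here is a tuple-based sketch of what option (i) would permit (helper names are made up; today a type checker would reject both calls):

```python
from typing import Any, Tuple

def expects_concrete_tuple(x: Tuple[int, str]) -> int:
    # Relies on x holding exactly an int followed by a str.
    return x[0] + len(x[1])

def demo() -> None:
    empty: Tuple[Any, ...] = ()
    wrong: Tuple[Any, ...] = ("a", 2)
    # Under option (i), both calls would typecheck, yet both fail at runtime:
    expects_concrete_tuple(empty)  # IndexError: tuple index out of range
    expects_concrete_tuple(wrong)  # TypeError: object of type 'int' has no len()
```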
However, option (ii) is less invasive, so I went with it. The Tensor examples typechecked fine. Let me know if anyone has strong opinions about option (i).
(Test cases:
https://github.com/pradeep90/pyre-check/blob/master/source/analysis/test/integration/typeVariableTest.ml#L3504-L4351)
****
I'll add these points to the PEP. I'll work on merging my changes into Pyre master, but this might take a few weeks because I'll have to replace the existing ListVariadic implementation.