I want to start off by thanking everyone who gave feedback on the first round of review of PEP 612. Your feedback illuminated a lot of issues and, I think, has made this PEP a lot better.
Over the past few weeks I have had the time to rewrite much of that document to integrate all of that feedback.
Notably, this included integrating `typing.type_variable_operators.Concatenate` to support decorators that modify a finite number of positional parameters.
With that being said, I'd appreciate another round of feedback from all of you to determine whether this form of the PEP is ready to move on to the Steering Council.
Next week we'll be holding our next tensor typing meeting.
WHEN? Monday 3rd August, 9:30am SF time.
WHERE? Google Meet. We'll post the link on the day itself.
1. Status updates. What have folks been working on over the past month?
2. What should symbols mean? Jörg and I would like to make a second attempt
at making the case that symbols should by default refer to semantic axis
types (e.g. `Batch` => a batch-like axis) rather than actual shapes
(`Batch` => a concrete value of '32'). We'll have a short presentation and
then we'll open it up for discussion.
3. Other. Is there anything else folks would like to talk about?
I was excited to learn about SupportsInt, SupportsFloat, and
SupportsComplex protocols decorated with @runtime_checkable because I
thought they bridged the gap between ABCs and typing protocol classes,
allowing consistent type checks by static tools and at runtime.
However, I am unable to see how to use those classes in practice.
For example, SupportsComplex: I understand that at runtime the
@runtime_checkable decorator merely checks whether __complex__ exists
on the type, but that is not a good predictor of whether a value can
actually be used as a complex number.
Types that actually can be converted to complex fail the isinstance
check against SupportsComplex. That's the case for built-in int and float,
as well as NumPy integer and float types like float16 and uint8.
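To make the failure concrete, here is a minimal reproduction using only the standard library (the NumPy types behave the same way):

```python
from typing import SupportsComplex

# complex() happily converts an int...
assert complex(1) == 1 + 0j

# ...but int does not define __complex__, so the runtime protocol
# check reports False:
assert hasattr(1, "__complex__") is False
assert isinstance(1, SupportsComplex) is False
```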
On the other hand, SupportsInt suggests that I can convert a complex
to int, but in fact the complex type implements __int__ and __float__ only
to provide a better error message, according to Guido van Rossum in an
earlier thread here.
>>> isinstance(1+0j, SupportsInt)
True
>>> int(1+0j)
Traceback (most recent call last):
  ...
TypeError: can't convert complex to int
The issue with SupportsInt also happens with SupportsFloat.
In addition, the way Mypy interprets these protocol classes is not the
same as what we can see at runtime.
Given the above, what are recommended use cases for the runtime
checkable SupportsInt, SupportsFloat, and SupportsComplex?
| Author of Fluent Python (O'Reilly, 2015)
| Technical Principal at ThoughtWorks
| Twitter: @ramalhoorg
Here's a status update for PEP 604:
- Ivan has asked the SC to find another PEP delegate for personal reasons,
and the SC has asked me to step up in his place. Philippe, the PEP's
author, has agreed.
- Maggie Moss has volunteered to write an implementation which is currently
in progress at https://github.com/python/cpython/pull/21515.
- I am quite positive about the PEP; I just think we need to tighten
the specification some more, especially in corner cases and for
interoperability with types defined in typing.py. I will work with Maggie
and Philippe on this. I see no problem getting this into 3.10. It will make
a nice complement to the list[int] notation introduced in 3.9 by PEP 585.
Here's a summary of the key points from the PEP:
- `int | str` represents the union of `int` and `str`
- There will be a new extension type, `types.Union` (1), to represent
unions created using `|`
- Such types will be acceptable in `isinstance(x, t)` and `issubclass(C, t)`
The rest is elaboration; most semantics are derived from the existing semantics
of `typing.Union`. We also strive for decent interoperability with the `typing` module.
(1) Beware! The existing union notation uses `typing.Union`; the new
notation uses `types.Union`.
--Guido van Rossum (python.org/~guido)
*Pronouns: he/him **(why is my pronoun here?)*
Hi list! I was asked to bring this up here. When PEP 604 was first introduced last year, I noticed a few under-specified parts, namely how it interacts with ForwardRef and the edge case of `None | None`.
>>> from typing import Union
>>> Union[int, 'str']
>>> int | 'str'
# should this raise an error, or return the same as above?
>>> Union[None, "ref1"]
>>> None | "ref1"
# should this construct (None first) be forbidden, or does str.__or__ or NoneType.__or__ need to be implemented?
>>> Union[None, None]
>>> None | None
# should this be forbidden? It seems very unlikely to come up in practice even if typing.Union supports it
>>> "ref1" | "ref2"
# would this be str.__or__ returning types.Union?
In the discussion last year, it was suggested that there could be a `typing.Forward` that could be used instead of the bare string "ref" cases, or even requiring the union to be fully quoted, e.g. `T = "Ref | OtherRef"`.
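For reference, here is what today's `typing.Union` does with bare strings, which is the behavior the `|` operator would need to either reproduce or explicitly reject:

```python
from typing import ForwardRef, Union

# typing wraps a bare string argument in a ForwardRef:
u = Union[int, "str"]
assert ForwardRef("str") in u.__args__

# ForwardRef instances compare equal by the string they wrap:
assert ForwardRef("ref1") == ForwardRef("ref1")
```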
Hello again! I'm working on a library that uses the return type of methods at runtime to generate GraphQL types, and I'm trying to
deal with circular imports and classes referencing each other.
I was thinking of creating a LazyType where you can pass the type name and the module, so that we are able to resolve
the actual type at runtime without having to guess where it came from, like this:
if TYPE_CHECKING:
    from .type_a import TypeA

def type_a(self, info) -> strawberry.LazyType["TypeA", ".type_a"]:
    from .type_a import TypeA
    ...
As in my previous post, unfortunately I can't rely on sys.modules, since the type is only imported when TYPE_CHECKING is True or inside the method.
Now, I managed to make it work with a custom LazyType class, so the annotation above works.
I don't dislike this, to be honest, but it looks like mypy isn't really happy, as you can see from the errors here:
tests/test_cyclic/type_b.py:13: error: "LazyType" expects no type arguments, but 2 given
tests/test_cyclic/type_b.py:13: error: Invalid type comment or annotation
tests/test_cyclic/type_b.py:16: error: Incompatible return value type (got "TypeA", expected "LazyType")
The first one could be resolved by using generics, but I'm not sure that's a good idea. The last one could be solved by creating a plugin; hopefully it won't be too difficult.
But I have no clue about `Invalid type comment or annotation`. Why would that be the case?
I noticed that the error disappears when removing the dot from the module name, but I'd like to be able to pass a dot there, so
users of the library don't have to type the full module, since we can infer that.
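For what it's worth, here is the general shape of the LazyType I have in mind. This is a simplified sketch with hypothetical names, not the actual library code; the real version also handles inferring the package for relative module names like `".type_a"`:

```python
import importlib

class LazyType:
    """Resolves a named type from a module only when first needed."""

    def __init__(self, type_name, module, package=None):
        self.type_name = type_name
        self.module = module
        self.package = package  # anchor for relative names like ".type_a"

    def __class_getitem__(cls, params):
        # Supports the LazyType["TypeA", ".type_a"] annotation syntax.
        type_name, module = params
        return cls(type_name, module)

    def resolve(self):
        module = importlib.import_module(self.module, self.package)
        return getattr(module, self.type_name)

# Usage with an absolute module name:
lazy = LazyType["OrderedDict", "collections"]
```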
Any ideas? Or do I need to change the approach for this? Maybe doing a less typehint-y way:
def x() -> strawberry.LazyType("TypeA", ".type_a"):
Let me know! :)
Hi folks, I'm working on a library that makes extensive use of type hints at runtime, and I found a use case where I need to
evaluate a forward reference.
As you can see, List["Review"] has a reference to the not-yet-declared class Review.
My use case is to generate classes based on the type hints of these 3 classes. Unfortunately,
when I try to inspect the field type of Product.reviews, I get List[ForwardRef('Review')],
and I'm not sure how I could get the actual type at runtime.
I've made a repl that shows what I've tried so far: https://repl.it/@patrick91/dcth-2
I've tried getting the global namespace for the module, but that won't work in this case, since
the classes are defined inside a function.
I have another option, which is to store all the decorated classes in a global dict (I'm using a different decorator than the dataclasses one),
but I don't want to do that, as my users could register multiple types with the same name.
Any ideas on how I could solve this?
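In case it helps frame the question, this is the workaround I know of: `typing.get_type_hints` will evaluate the ForwardRef if you explicitly pass the namespace the classes live in, which is exactly the namespace I don't have access to for classes defined inside a function (sketch with made-up class names):

```python
from dataclasses import dataclass, field
from typing import List, get_type_hints

def build():
    @dataclass
    class Product:
        # Forward reference: Review is not defined yet at this point.
        reviews: List["Review"] = field(default_factory=list)

    @dataclass
    class Review:
        body: str

    # Module globals won't contain Review here, so we must supply
    # the function's local namespace ourselves:
    hints = get_type_hints(Product, localns={"Review": Review})
    return hints["reviews"]

resolved = build()  # List[Review], with the ForwardRef evaluated
```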
I would love to share something I have been working on for the last year!
I have implemented an emulation of Higher Kinded Types for mypy.
Here I would love to describe how it works and (hopefully!) receive your feedback.
Some context before we start:
1. What are Higher Kinded Types (or HKT)? It is basically a concept of
"generics of generics". Here's the simplest example:
def build_new_object(typ: Type[T], value: V) -> T[V]: ...
# won't work, because HKT is not natively supported
2. What is "emulated" HKT support? There's a very popular approach to model
HKT not as `T[V]` but as a `Kind[T, V]`. Original whitepaper:
3. Why would anyone need this? That's a good question. HKT gives a lot of
power. It is widely used with functional programming. You can find a full
example here: https://sobolevn.me/2020/06/how-async-should-have-been
With HKT one can rewrite an `@overload`ed function so that a single definition covers both async and sync code:
client_get: Callable[[str], Kind2[_IOBased, httpx.Response, Exception]],
) -> Kind2[_IOBased, int, Exception]:
... # implementation will be slightly different
4. Is it ready for production use? No! It is just an early prototype.
There are a lot of things to polish. There are some known hacks and
limitations right now (that can possibly stay forever).
Now, let's dig into how it works!
First of all, there's a new primitive to represent `Kind`. It is called
`KindN` and has 3 aliases at the moment (this might change in the future):
`Kind1`, `Kind2`, and `Kind3`. Why?
- `Kind1` represents types with one type variable, like `List[X]`
- `Kind2` represents types with two type variables, like `Result[Value, Error]`
- `Kind3` represents types with three type variables, like `Generator[X, A, B]`
- `KindN` is the base type for all of them; it can have any number of type arguments
Now, let's say I want to build a `map_` function to map a pure function
over a `Functor` instance. How should I do that?
I need to:
1. Implement `Mappable` (or `Functor`) interface
2. Implement some data type that implements your `Mappable` interface
3. Implement the hacks I mentioned earlier, to make sure that you
will be able to work with `KindN` values
4. Implement the `map_` function itself
Let's start with the interface:
_MappableType = TypeVar('_MappableType', bound='MappableN')

class MappableN(Generic[_FirstType, _SecondType, _ThirdType]):
    """
    Allows to chain wrapped values in containers with regular functions.

    Behaves like a functor.
    """

    @abstractmethod
    def map(  # noqa: WPS125
        self: _MappableType,
        function: Callable[[_FirstType], _UpdatedType],
    ) -> KindN[_MappableType, _UpdatedType, _SecondType, _ThirdType]:
        """Allows to run a pure function over a container."""

#: Type alias for kinds with one type argument.
Mappable1 = MappableN[_FirstType, NoReturn, NoReturn]
This is pretty straightforward. Some things to notice:
- `_MappableType` must be bound to `MappableN`; we use this type to
annotate `self`, which is required
- It returns a modified `KindN` instance; we work with the first type argument
- The method is abstract, because we need an actual implementation from
the child types
Now, let's write some real data type to satisfy this contract. I will just
use the code from here:
Some things to notice:
- We use `Kind1` as a supertype for our own data type; it means that we
only have one type argument, which is required. Notice that we pass the
`'MyClass'` forward reference into `Kind1`
- mypy makes sure that the `map` method definition exists and is correct; any
violation of the type contract will make mypy raise errors
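To give a concrete (if simplified) picture, a one-type-argument container with a Functor-style `map` looks roughly like this. This is my own dependency-free sketch: I'm leaving out the `Kind1` supertype, so it shows only the runtime shape, not the HKT wiring:

```python
from typing import Callable, Generic, TypeVar

_ValueType = TypeVar('_ValueType')
_NewValueType = TypeVar('_NewValueType')

class Box(Generic[_ValueType]):
    """A minimal container with one type argument and a `map` method."""

    def __init__(self, inner_value: _ValueType) -> None:
        self._inner_value = inner_value

    def map(self, function: Callable[[_ValueType], _NewValueType]) -> 'Box[_NewValueType]':
        # Apply a pure function to the wrapped value, re-wrapping the result.
        return Box(function(self._inner_value))

value = Box(21).map(lambda x: x * 2)._inner_value  # 42
```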
Now, let's try to implement the `map_` function for any `Mappable` and any
kind. That's how it will look:
from returns.primitives.hkt import KindN, kinded

_MappableKind = TypeVar('_MappableKind', bound=MappableN)

@kinded
def map_(
    container: KindN[_MappableKind, _FirstType, _SecondType, _ThirdType],
    function: Callable[[_FirstType], _UpdatedType],
) -> KindN[_MappableKind, _UpdatedType, _SecondType, _ThirdType]:
    ...
Things to note:
- We need to mark this function as `@kinded`, because we need to tweak its
return type with a custom plugin. We don't want `KindN` instances, we
need real types!
- We need to bind `_MappableKind` to our `MappableN` interface to make
sure we have access to its methods (`.map` in our case)
The only thing left is the implementation. The obvious way, `return
container.map(function)`, won't work, because:
1. `KindN` does not have a `.map` method; we need to somehow get to the real type
2. It would have the wrong return type: `KindN[MappableN, _UpdatedType,
_SecondType, _ThirdType]`, while we need `KindN[_MappableKind, _UpdatedType,
_SecondType, _ThirdType]` (the difference is in the first type argument).
So, we need to use the dirty hacks!
from returns.primitives.hkt import debound
new_instance, rebound = debound(container)
What does it do?
1. It returns a proper `new_instance` with the `MappableN` type
2. It also returns a `rebound` callback to coerce the return type to the correct one
The only thing left is to test it!
Here are the tests:
To sum things up:
1. The current state of HKTs in Python allows defining complex type contracts
2. We are able to write `@kinded` methods and functions that return the
correct types in the end
3. We have some limitations, like a maximum of 3 type parameters at the moment
4. There are some minor API tweaks when working with kinds: `debound`, `dekind`, etc.
5. There are also some blockers on the mypy side to improving some API
parts, like: https://github.com/python/mypy/issues/9001
I would love to answer any questions you have.
And I hope to hear your feedback on my work.
(Moving this thread here from the other mailing list.)
Thanks for the update, Alfonso. Cool to know there's active work on this in
Pyre - and that proper support for variadics is close to becoming a
reality, if Mark is planning to submit a PEP.
> In that sense, I believe that there are many reasons to think that it is not a
> good idea to create yet another type checker for Python. In the particular
> case of Deepmind, my humble suggestion would be to contribute to Mypy or
> Pyre, at least for the part related with the type system.
Right. Initially I was thinking it might be better to have a separate tool
for shape checking so that it might be easier to plug into existing
infrastructure for folks who have already committed to a particular type
checker (e.g. we use pytype, so if we implemented support in Mypy we'd have
to run both, and that seemed like it might have some complications) - but
on reflection I agree it would be better to contribute to an existing
checker a) to make use of the existing framework that checker would provide
and b) yeah, to avoid proliferation of tools.
(Also, to be clear: we don't have any official work on this at DeepMind;
just a few of us interested and playing around in our free time.)
> for example arithmetic on types, which actually is something that we are
> currently working on, in case that anyone is interested.
So it sounds like Pyre already has a pretty developed roadmap for this -
super cool :) In that case, maybe the question is: what are you working on
already, and what's left to be done - that is, what external contributions
would be most useful?
On Mon, 15 Jun 2020 at 11:39, <proyect.hd(a)gmail.com> wrote:
> Hi everyone,
> Thanks once again for bringing attention to this important topic.
> I think that we all agree that the main step in this direction is the
> introduction of variadics, which has been mentioned several times (1,
> 2 <https://github.com/python/typing/issues/513>, 3).
> Variadic support is more mature than it seems. In the case of Pyre, we
> have had support for variadics for a year already. The first official
> proposal that I recall was at last year's Python Typing Summit (here
> The syntax is aligned with the proposal that Guido has shared. However,
> iirc the initial syntax relied too much on Concatenate/Expand, making it
> verbose and ambiguous when there are 2 variadics. For that purpose, the
> current syntax relies on capture groups for manually specifying the
> part of the type that corresponds to the variadic, and only requires
> Concatenate for concatenating types and variadics. More about the final
> syntax can be seen here (here
> Although I don't want this to become a PEP, the way the syntax works with
> the proposed example is:
> tf.Tensor[Batch, Time, [64, 64, 3]]
> Special cases could be considered to make it more ergonomic when there is
> only one variadic at the end; in general, having an unambiguous syntax is a
> priority.
> Regarding maturity, afaik Mark Mendoza had informal conversations with the
> community regarding the proposed syntax and got positive feedback, and he
> currently plans to submit a PEP, once the Parameter Specification PEP gets
> merged (here <https://www.python.org/dev/peps/pep-0612/>).
> Hence, if we assume that we are not so far from agreeing on a final
> syntax, then the next question is about having actual support for it. As I
> mentioned, Pyre already supports it so it could be used for testing ideas
> and giving developers the opportunity of writing code stubs before other
> type checkers get support. In that sense, I believe that there are many
> reasons to think that it is not a good idea to create yet another type checker
> for Python. In the particular case of Deepmind, my humble suggestion would
> be to contribute to Mypy or Pyre, at least for the part related to the
> type system. Regarding Teddy's work, afaik Teddy was trying to contribute
> to Mypy itself so perhaps his checker is more a proof of concept. I hope
> that he will tell us more about it.
> About static shape checkers that rely on abstract interpretation, for sure
> there is room for them, but it will be better if they rely on the more
> advanced type specifications that variadics will introduce. After all,
> variadics at this point will only be useful for some use cases, since for
> many tensor operations other functionalities are needed, for example
> arithmetic on types, which actually is something that we are currently
> working on, in case that anyone is interested.
> Finally, if there are many teams working in this direction, I would suggest
> organising some online meetings, inspired by what they do at MLIR (here
> PS: I would propose to keep future discussion in Python Typing mailing
> list. Many people that would be interested in this sort of discussions
> follow that list.
On Mon, 15 Jun 2020 at 11:36, Matthew Rahtz <mrahtz(a)google.com> wrote:
> Thanks for the links, Guido! That first one in particular has some
> interesting ideas I hadn't seen before. I'll incorporate them as options
> into the doc we're preparing.
> Adam -
> IMO the biggest bottleneck to porting my tool, or implementing any other
>> one, is a good abstract interpreter for Python that can handle programs
>> that are incomplete, have syntax errors, etc
> Interesting. I'm guessing you're saying this with a view to having good
> support in e.g. IDEs? Sergei, am I right in thinking you have some
> experience with this? I guess PyCharm must also have some internal solution
> to this; I'm not sure how hard it would be to use it as a standalone tool...
> I also wanted to add that in my experience implementing an interpreter in
>> Python will likely make it more difficult to write something good in the
>> long term, because we'll miss out on all the nice things like ADTs with
>> pattern matching, exhaustiveness checks, etc.
> Also interesting. Yes, let's keep this in mind.
> On Fri, 12 Jun 2020 at 17:24, Guido van Rossum <guido(a)python.org> wrote:
>> Here are some links to docs with well thought-out ideas for additions to
>> the standard Python (static) type system as described by PEP 484 to support
>> numpy(-ish) array types and shapes:
>> Here's a doc I've held on to (mostly written by Ivan Levkivskiy) with a
>> list of proposals that made the rounds at PyCon 2019 (and even before).
>> And check out the typing summit schedule:
>> On Fri, Jun 12, 2020 at 9:19 AM 'Adam Paszke' via Python shape checkers <
>> python-shape-checkers(a)googlegroups.com> wrote:
>>> Hi everyone,
>>> Thanks a lot for starting the list Matthew! It'll be great for all of us
>>> to get together.
>>> Since you've asked about the stage of my project, the Swift prototype is
>>> working pretty nicely, but I didn't get to porting it to Python (except for
>>> a *very early* prototype in Haskell). However, if there is sufficient
>>> interest in such an effort, then I'm happy to reprioritize and I'm pretty
>>> sure that I could spend even up to 50% of my time on that.
>>> IMO the biggest bottleneck to porting my tool, or implementing any other
>>> one, is a good abstract interpreter for Python that can handle programs
>>> that are incomplete, have syntax errors, etc. That's an effort we could
>>> definitely share, and in the future we can even prototype multiple
>>> different checkers on top of this common infrastructure. In Swift this was
>>> easy, because I could write a very simple interpreter for SIL (Swift
>>> Intermediate Representation), which was already quite low-level. Mirroring
>>> Python's semantics faithfully will be much more difficult as even "trivial"
>>> operations like attribute lookups can get extremely complicated.
>>> I also wanted to add that in my experience implementing an interpreter
>>> in Python will likely make it more difficult to write something good in the
>>> long term, because we'll miss out on all the nice things like ADTs with
>>> pattern matching, exhaustiveness checks, etc. Haskell/OCaml make those
>>> things a breeze. I do understand that they might make onboarding a little
>>> slower, but I wouldn't want to rule them out as an option just yet, as it
>>> makes a huge difference.
>>> Also, it's super cool to see that other people are getting around to
>>> implementing more static checkers too. I'll have to read Teddy's thesis
>>> soon to better understand how it relates to what I did.
>>> On Fri, Jun 12, 2020 at 1:47 PM 'mrahtz' via Python shape checkers <
>>> python-shape-checkers(a)googlegroups.com> wrote:
>>>> Hey everyone!
>>>> First off - welcome to the group! There's been scattered interest in
>>>> shape checking for some time, so by coming all together in one place here
>>>> rather than in scattered email threads and GitHub issues and Slack channels,
>>>> I'm hoping we can push this through to something suitable for widespread use.
>>>> To summarise the current state of shape annotation and checking, there
>>>> are three categories of things to care about:
>>>> - Defining the syntax for how code should be annotated with shapes
>>>> - Runtime shape checkers
>>>> - Static shape checkers
>>>> Stephan made a great start with Ideas for array shape typing in Python
>>>> A group of us at DeepMind have been working on a followup which goes into
>>>> more detail which we should be able to share soon once it's been cleaned up.
>>>> There's also the syntax that tsalib <https://github.com/ofnote/tsalib> uses,
>>>> though it only allows annotation of shapes (without specifying e.g.
>>>> whether something is a tf.Tensor or an np.ndarray, and without any info
>>>> on data type).
>>>> *Runtime shape checkers*
>>>> Some existing options here include ShapeGuard
>>>> <https://github.com/Qwlouse/shapeguard> (which is what most folks at
>>>> DeepMind are using at the moment) and tsanley
>>>> <https://github.com/ofnote/tsanley>. We also have an internal
>>>> prototype at DeepMind that can check annotations like x:
>>>> tf.Tensor[Batch, Time, 64, 64, 3] which we're still working on.
>>>> *Static shape checkers*
>>>> This is where things get interesting. A static shape checker is
>>>> probably what it's going to take for us to be able to say the problem of
>>>> shape checking is 'solved'.
>>>> One bottleneck here is support for variadic generics in existing static
>>>> type checkers. Pyre apparently has prototype support for this in its
>>>> <https://github.com/facebook/pyre-check/blob/master/examples/pytorch/stubs/_…> type
>>>> (see also this PyTorch example
>>>> <https://github.com/facebook/pyre-check/blob/e16cfea51df9466bc84841065fb3149…>) but
>>>> as far as I know neither pytype nor mypy supports this yet.
>>>> In the meantime, Teddy Liu has developed a toy static checker
>>>> <https://github.com/theodoretliu/thesis> for his bachelor's thesis.
>>>> It's written in OCaml, but he reckons it should be portable to Python if needed.
>>>> Outside of the Python world, Adam Paszke has made a static checker for
>>>> Swift for TensorFlow <https://github.com/google-research/swift-tfp>.
>>>> He's also interested in developing something similar for Python.
>>>> *Next steps*
>>>> I think the general direction of next steps should be to continue
>>>> developing a static checker while simultaneously trying to work out the
>>>> details of what a full syntax would look like (based on what we find to
>>>> work well in practice with the static checker).
>>>> In particular, I'm wondering whether it would be worth porting Teddy's
>>>> checker to Python (which I'm assuming more people will be comfortable with
>>>> than OCaml) or whether we should join Adam in developing something from
>>>> scratch. Adam, how are things going on your end?
>>>> I'll be able to work on this full-time for two weeks from the 22nd as
>>>> part of a DeepMind hackathon, during which my plan is to finish the draft
>>>> syntax proposal doc and polish an internal prototype we have for a runtime
>>>> checker based on that syntax - but I'm also open to other ideas, if there
>>>> are other higher-leverage things to do.
>> --Guido van Rossum (python.org/~guido)
>> *Pronouns: he/him **(why is my pronoun here?)*