Looking at the implementation, I think this can be significantly simplified (avoiding a lot of the pitfalls that Guido mentioned) by considering the following:

- You don't really care about the run-time call chain (which could go up or down your syntactic tree of function/class defs), and that's what the frames represent. You really care about the static structure. You're working around that by pulling f_outer_typeparams from the function object's func_typeparams each time you start a new frame... but in the end that means you don't need to keep it in the frame at all. Each time you use frame->f_outer_typeparams, you could actually be using frame->f_func->func_typeparams, so the frame field is no longer needed.
- The f_local_typeparams is kind of volatile. You always create it for a function definition, use it a few opcodes later, and then its content is no longer relevant. I'm pretty sure it could live somewhere on the interpreter stack instead (and go away after your MAKE_FUNCTION, or the creation of the class).

You shouldn't need cells for this, and I think it can be done with minor changes to your existing implementation.

In fact, if you want to go the extra mile: given that the typevar structure of scopes is known statically (it's the one in the source, independent of execution) and typevars are immutable, you could create the typevars at compile time and store them in the code objects rather than the functions. Then you could have a LOAD_TYPEVAR opcode which works very similarly to LOAD_CONST, pulling the typevar. There are some complexities to this (especially in the changes to code objects and their deserialization): if you want to keep the identity of typevars from outer scopes to inner scopes, you need to pass the containing code block's typevars when building (or unmarshalling!) code objects, and you need to define a way to serialize bounds at compile time in a form that marshal can support (perhaps stringify them, or add some sort of function to evaluate them later, PEP 649 style).

That would be super fast: `import my_mod` would ensure all your typevars are created, you'd have exactly one typevar per typevar definition in your source, and the compiler doesn't need to do much more than it's already doing. The only runtime cost of looking up a typevar is going from the frame to the code object to the typevar tuple, and indexing.

Hope this helps!

D.

On Sat, 9 Jul 2022 at 02:20, Eric Traut <eric@traut.com> wrote:
Jelle and I met yesterday, and he provided me with additional insights about how runtime type checkers use type information. From that conversation, I concluded that my proposed approach (the one that involves type parameter "proxies") probably wouldn't meet the needs of runtime type checkers. We did some additional brainstorming, and I spent some time today exploring alternatives. I think I now have a design that meets all of the requirements.
Like the previous proposal, this new design doesn't require the use of "lambda lifting", which I still consider problematic for a number of reasons.
The new design involves two new fields in the "frame" object at runtime. These fields track which type parameters are "live" at runtime. Both fields can be NULL or refer to a tuple of TypeVar-like objects (TypeVar, TypeVarTuple, ParamSpec). The first field, "f_outer_typeparams", contains all of the type parameters defined by outer scopes. The second field, "f_local_typeparams", contains all of the outer type parameters plus any type parameters that are temporarily needed for a generic "class", "def" or "type" statement in the current scope.

This proposal introduces two new opcodes: EXTEND_TYPEPARAMS and LOAD_TYPEPARAM. The EXTEND_TYPEPARAMS op builds a new "f_local_typeparams" tuple from the "f_outer_typeparams" tuple and n new TypeVar-like objects that have been pushed onto the stack. The LOAD_TYPEPARAM op loads a single type parameter from "f_local_typeparams", referenced by numeric index. The compiler tracks the indices of all "live" type parameters, so it's able to emit the appropriate index as part of the opcode.

When a new scope is entered (e.g. during a function or lambda call or the execution of a comprehension), the "f_local_typeparams" tuple is copied into the "f_outer_typeparams" field of the next scope. Through this mechanism, all of the "live" type parameter objects are always available even in inner scopes.
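To make the mechanics concrete, here is a pure-Python model of the two frame fields and the semantics of the proposed opcodes. All names here (`Frame`, `extend_typeparams`, `load_typeparam`, `enter_scope`) are illustrative stand-ins, not actual CPython APIs; strings stand in for TypeVar-like objects:

```python
class Frame:
    def __init__(self, outer_typeparams=None):
        # f_outer_typeparams: type params inherited from the enclosing scope
        self.f_outer_typeparams = outer_typeparams
        # f_local_typeparams: outer params plus any params introduced locally
        self.f_local_typeparams = outer_typeparams


def extend_typeparams(frame, *new_params):
    # Models EXTEND_TYPEPARAMS(n): build a new f_local_typeparams tuple from
    # f_outer_typeparams plus n new TypeVar-like objects (which, in the real
    # design, would have been pushed onto the interpreter stack).
    outer = frame.f_outer_typeparams or ()
    frame.f_local_typeparams = outer + new_params


def load_typeparam(frame, index):
    # Models LOAD_TYPEPARAM(i): fetch a live type parameter by numeric index.
    return frame.f_local_typeparams[index]


def enter_scope(caller):
    # Entering a new scope copies the caller's f_local_typeparams into the
    # new frame's f_outer_typeparams.
    return Frame(outer_typeparams=caller.f_local_typeparams)
```

For example, `extend_typeparams(module_frame, "A", "B")` followed by `enter_scope(module_frame)` yields an inner frame whose `f_outer_typeparams` is `("A", "B")`, so type parameters remain reachable by index from nested scopes.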
Here's an example:

```python
# At this point, f_outer_typeparams and f_local_typeparams are NULL.

# The compiler emits code to construct two TypeVar objects for `A` and `B`, then
# emits an EXTEND_TYPEPARAMS(2) op. This builds a new f_local_typeparams tuple
# that contains (A, B). When evaluating the expression `dict[A, B]`, the
# reference to `A` generates a LOAD_TYPEPARAM(0) op, and the reference to `B`
# generates a LOAD_TYPEPARAM(1) op.
class Outer[A, B](dict[A, B]):
    # At this point, f_outer_typeparams and f_local_typeparams contain (A, B).

    # The compiler emits code to construct two TypeVar objects for `C` and `D`,
    # then emits an EXTEND_TYPEPARAMS(2) op. This builds a new f_local_typeparams
    # tuple that contains (A, B, C, D). When evaluating the expression
    # `dict[B, D]`, the reference to `B` generates a LOAD_TYPEPARAM(1) op, and
    # the reference to `D` generates a LOAD_TYPEPARAM(3) op.
    class Inner[C, D](dict[B, D]): ...

    # At this point, f_outer_typeparams and f_local_typeparams contain (A, B).

    # The compiler emits code to construct one TypeVar object for `X`, then emits
    # an EXTEND_TYPEPARAMS(1) op. This builds a new f_local_typeparams tuple that
    # contains (A, B, X). When evaluating the type annotations for parameters
    # `a`, `b` and `x`, the appropriate LOAD_TYPEPARAM ops are generated.
    def method[X](self, a: A, b: B, x: X): ...
```
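The tuple bookkeeping in the example above can be traced with ordinary `typing.TypeVar` objects. This is just a simulation of how the tuples evolve, not the proposed opcodes themselves:

```python
from typing import TypeVar

A, B = TypeVar("A"), TypeVar("B")

# class Outer[A, B]: EXTEND_TYPEPARAMS(2) at module level builds (A, B)
outer_scope = (A, B)

# class Inner[C, D] inside Outer: starts from (A, B), appends C and D
C, D = TypeVar("C"), TypeVar("D")
inner_scope = outer_scope + (C, D)

# dict[B, D] inside Inner compiles to LOAD_TYPEPARAM(1) and LOAD_TYPEPARAM(3)
assert inner_scope[1] is B and inner_scope[3] is D

# def method[X] inside Outer: starts from (A, B) again, appends X
X = TypeVar("X")
method_scope = outer_scope + (X,)
assert method_scope == (A, B, X)
```

Because the outer tuple is extended rather than replaced, indices assigned to outer type parameters stay stable in every inner scope.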
```
print(Outer.__parameters__)
(~A, ~B)
print(Outer.Inner.__parameters__)
(~C, ~D)
print(Outer.method.__annotations__)
{'a': ~A, 'b': ~B, 'x': ~X}
```
The f_outer_typeparams value is also stored on the function object in a new internal field called func_typeparams. This allows f_outer_typeparams to be restored to the current scope when resuming a coroutine, executing an async function, calling a lambda, etc.
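A minimal sketch of why the function object needs this snapshot: the new frame's outer type parameters must come from where the function was *defined*, not from whoever happens to call it. The `Function` class and attribute names below are hypothetical models, not CPython internals:

```python
class Function:
    def __init__(self, code, defining_typeparams):
        self.code = code
        # func_typeparams: snapshot of f_local_typeparams taken from the
        # frame that was live when this function was defined.
        self.func_typeparams = defining_typeparams

    def call(self):
        # On every call (or coroutine resume), the new frame's
        # f_outer_typeparams is restored from the function object, so the
        # caller's frame is irrelevant.
        f_outer_typeparams = self.func_typeparams
        return self.code(f_outer_typeparams)


# A lambda defined inside a generic class body still sees (A, B),
# even when invoked later from an unrelated scope.
fn = Function(lambda outer: outer, ("A", "B"))
assert fn.call() == ("A", "B")
```

This mirrors how closures capture their defining environment rather than their calling environment.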
Other than the addition of two new fields to the frame object, one new field in the function object, and two new opcodes, this proposal adds no significant complexity to the compiler or the runtime. Performance impact should be negligible. It works well with the PEP 649 proposal for deferred evaluation of annotations, and it works well with today's (non-deferred) annotation evaluation.
One small complexity is with forward-referenced annotations and the use of `get_type_hints()`, which will need to populate the `locals` dictionary with the "live" type parameters before calling `eval()` on the annotation string. The live type parameters are available on function, class, and type alias objects via a new `__typeparams__` attribute.
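A sketch of that `locals` population step, assuming the proposed `__typeparams__` attribute exists. The helper name `eval_annotation` is hypothetical, illustrating what `get_type_hints()` would need to do internally:

```python
from typing import TypeVar


def eval_annotation(annotation: str, typeparams, globalns=None):
    # Seed the locals namespace with the "live" type parameters (taken from
    # the proposed __typeparams__ attribute) before evaluating the
    # stringized annotation, so names like `A` resolve to the same objects.
    localns = {tp.__name__: tp for tp in typeparams}
    return eval(annotation, globalns or {}, localns)


A, B, X = TypeVar("A"), TypeVar("B"), TypeVar("X")

# With Outer.method.__typeparams__ == (A, B, X), the forward reference
# "dict[A, X]" resolves using the identical TypeVar objects.
resolved = eval_annotation("dict[A, X]", (A, B, X))
assert resolved == dict[A, X]
```

The key property is identity: the evaluated annotation refers to the very same TypeVar objects that the class or function was parameterized with, not fresh look-alikes.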
```
print(Outer.__typeparams__)
(~A, ~B)
print(Outer.Inner.__typeparams__)
(~A, ~B, ~C, ~D)
print(Outer.method.__typeparams__)
(~A, ~B, ~X)
```
Let me know if you see any holes or have concerns with this proposal. If you're interested in looking at the CPython implementation, check out this branch: https://github.com/erictraut/cpython/commits/type_param_syntax2.
Assuming that this design is amenable to the typing community, my next step is to post an update to the PEP that describes the design. At that point, I think we can notify the python-dev community that the PEP is ready for their review.
-Eric