Thanks for your thoughts so far everyone!
Eric Traut and I recently discussed this topic for pyright: https://github.com/microsoft/pyright/discussions/2300. I'm definitely on board with the idea of marking deprecations in a way that is visible to type checkers.
Aha, the ideas there are quite similar. A new type typing.Deprecated would enable deprecating parameters/constants/etc. A decorator, in addition to the type, would also be nice. (And if we used typing.Annotated, the type alone would force users to include the original type, since Annotated doesn't support inference, so a decorator would also be friendlier to use.) But if we can just use typing.Deprecated and make it accept two optional parameters: 1) the original type; 2) the deprecation reason, then users don't need to repeat the original type:

```python
def foo(x, y: typing.Deprecated): ...
def foo2(x: int, y: typing.Deprecated[int]): ...
def foo3(x: int, y: typing.Deprecated[int, "deprecation reason"]): ...
```

Also, in my proposal I'm using the return type of the method/function to mark it as deprecated:

```python
# The following means the function `foo4` is deprecated:
def foo4(x: int) -> typing.Deprecated[str]: ...
```

I'm not quite happy about this, since unfamiliar readers could read it as `foo4` returning something deprecated (as opposed to `foo4` itself being deprecated). So a decorator may be preferable for methods/functions.
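For illustration, a rough sketch of what the decorator spelling could look like (purely hypothetical; the `deprecated` name and signature below are placeholders, not an existing API):

```python
from typing import Callable, TypeVar

_F = TypeVar("_F", bound=Callable[..., object])

def deprecated(reason: str) -> Callable[[_F], _F]:
    """Hypothetical marker decorator: a runtime no-op that type checkers could recognize."""
    def mark(func: _F) -> _F:
        return func
    return mark

@deprecated("use bar() instead")
def foo4(x: int) -> str:
    ...
```

Spelled this way, it is unambiguous that `foo4` itself is deprecated rather than its return value.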
1) To me, "the function I am calling is deprecated" does not seem to violate any typing-related constraints, at least not in a traditional sense. We are not doing things like passing an int to a function that expects a string, or having argument mismatch issues, which all lead to runtime errors. No soundness issues are involved here, and I'm not sure deprecation detection belongs in that camp. Of course, that does not mean type systems cannot be extended to catch more issues, but for extensions like this I would generally expect a more precise description of the proposed system -- e.g. what additional typing constraints you want to introduce (like when and where should a type checker report this "deprecation" error?), how the new type interacts with pre-existing types (like how it affects subtyping, and what are the meanings/behaviors of `Union[int, Annotated[str, Deprecated(...)]]` or `Dict[str, Iterable[Annotated[int, Deprecated(...)]]]`?), etc.
I agree that we should not require type checkers to do the _actual reporting_ to users. Type checkers can choose to do so, but shouldn't be required to report. The main requirement is for type checkers to understand the proposed addition to the type system.

Re subtyping: (assuming we end up using typing.Deprecated) we could require that typing.Deprecated only be used at the outermost position. And typing.Deprecated isn't transferable, meaning:

```python
# foo.py:
value: typing.Deprecated[int] = 1

# bar.py:
import foo
value = foo.value  # The type of bar.value should be int, not typing.Deprecated[int]
items = {'foo': foo.value}  # The type of bar.items should be Dict[str, int], not Dict[str, typing.Deprecated[int]]

# user.py:
import foo
import bar
_ = foo.value  # This should be reported as use of a deprecated API
_ = bar.value  # This should not be reported as use of a deprecated API
_ = bar.items  # This should not be reported as use of a deprecated API
```
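To make the outermost-only rule concrete, here is a sketch using a local stand-in class (since the proposed typing.Deprecated doesn't exist yet):

```python
from typing import Dict, Generic, TypeVar

T = TypeVar("T")

class Deprecated(Generic[T]):
    """Local stand-in for the proposed typing.Deprecated, for illustration only."""

# Allowed: the marker wraps the entire annotation.
retries: Deprecated[int]
table: Deprecated[Dict[str, int]]

# Disallowed under the outermost-only rule: nesting the marker inside another
# type, which is exactly what raises the Union/Dict questions quoted above.
# nested: Dict[str, Deprecated[int]]
```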
2) Let's assume that we end up deciding we want to support deprecation detection in the type system. In that case, I would still argue against relying on `Annotated` for the job. My reading of PEP 593 is that the interpretation of `Annotated` is meant to be tooling-specific -- type checkers themselves may not want to assign specific meanings to those annotations, as those meanings may create conflicts with downstream tooling. If you want to standardize your extensions to the type system, I personally think it would be much cleaner to just add new classes to the stdlib (e.g. something like `typing.Deprecated[T]`?), which can do whatever `Annotated` can do but without the issues.
Agreed too. If we end up with a proposal for everyone to adopt, it shouldn't use Annotated. I'm coming at this from the angle that we want to first experiment with this in pytype; for other type checkers, at the very least the use of such annotations shouldn't cause issues if they don't want to adopt it.
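For the pytype experiment, the spelling could be something like the following (a sketch only; the Deprecated marker class here is made up, and under PEP 593 checkers that don't opt in would simply ignore it):

```python
from dataclasses import dataclass
from typing import Annotated  # or typing_extensions on older Pythons

@dataclass(frozen=True)
class Deprecated:
    """Hypothetical marker object carried inside Annotated."""
    reason: str = ""

# A tool that opts in can surface the marker; everyone else just sees a plain int.
timeout: Annotated[int, Deprecated("use timeout_seconds instead")] = 30
```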
3) Let's assume that we end up adopting your approach. In that case, I would still propose two small revisions to your `deprecated` API. The first is that Pyre does not really want to support an unbounded `Callable[[T], T]` in a return type. See https://github.com/facebook/pyre-check/issues/329 . It would be nice if it could be rewritten as a callback protocol. The second is that the `deprecated` function could be more fully typed with the ParamSpec introduced in PEP 612. There's also the issue you mentioned in your message, but that's something to be fixed on the Pyre side.
That's a good suggestion, thanks!
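For reference, a revised version along those lines might look roughly like this (a sketch only, assuming typing_extensions provides ParamSpec and Protocol; the protocol name is mine):

```python
import functools
import warnings
from typing import Callable, TypeVar
from typing_extensions import ParamSpec, Protocol  # stdlib typing on newer Pythons

P = ParamSpec("P")
R = TypeVar("R")

class _FunctionDecorator(Protocol):
    # Callback protocol instead of a bare Callable[[T], T] return type,
    # so the type variables are scoped to __call__ rather than left unbound.
    def __call__(self, func: Callable[P, R]) -> Callable[P, R]: ...

def deprecated(reason: str) -> _FunctionDecorator:
    def decorator(func: Callable[P, R]) -> Callable[P, R]:
        @functools.wraps(func)
        def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
            warnings.warn(
                f"{func.__qualname__} is deprecated: {reason}",
                DeprecationWarning,
                stacklevel=2,
            )
            return func(*args, **kwargs)
        return wrapper
    return decorator

@deprecated("use new_api() instead")
def old_api(x: int) -> str:
    return str(x)
```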
4) Finally, let me also propose an alternative approach where the check gets implemented not inside the type system but as a (tool-specific) downstream linter instead. The idea is to have the type checker expose an interface that returns all of the callsites it knows about, and then let the linter go over them: for each callsite, check whether the possible callees reside in a pre-determined deprecation list (which can be obtained by looking for syntactical hints around functions/classes/globals, such as decorators, or even just a pre-defined list of names). If we do it like this, no standardized type system change is needed. The only thing that may need to be standardized is the form of the syntactical hints, which would be much lighter-weight than a standardized type system extension.
What would this form of syntactical hints look like, though? And how would it be lighter-weight than, say, a typing.Deprecated type?
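For concreteness, the lightest-weight form I can picture is a plain registry of fully qualified names that a downstream linter would match against the callees resolved by the type checker -- something like the sketch below (all names are made up for illustration):

```python
# deprecations.py -- purely illustrative; not part of any existing tool.
# A linter could compare these fully qualified names against the callee of
# each callsite reported by the type checker.
DEPRECATED_APIS = {
    "mypkg.legacy.old_api": "use mypkg.new_api instead",
    "mypkg.settings.TIMEOUT": "use mypkg.settings.TIMEOUT_SECONDS instead",
}
```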