The past few weeks I've been working on adding a C extension to the
reference implementation for PEP 615 (Support for the IANA Time Zone
Database in the Standard Library - PEP link
<https://www.python.org/dev/peps/pep-0615/>), but I've had some trouble
inducing anyone to review the code. There are about 2200 lines of C code
that pass all the tests for the
pure Python implementation (which has 100% code coverage) - would anyone
be interested in taking a look before I merge it into the reference
implementation repo? Any little bit of review (including incremental
reviews) would help:
Note: You'll need to click "Load diff" on the zoneinfo_module.c file to
see the actual code, because the diff is so large. The stuff that's
immediately visible in the diff is the more straightforward code.
I'll note that I am also happy to accept review comments about other
parts of the repo, not just the C code, but the C code is a priority
since errors there tend to be less forgiving.
So over on https://bugs.python.org/issue22699 I've been talking to
myself as I figure out all the ways that cross-compiling (on Ubuntu,
targeting another Linux distro via an SDK) fails.
In short, it's either because sysconfig can't provide details about any
CPython install other than the currently running one, or it's because
our setup.py (for building the CPython extension modules) uses sysconfig.
Either way, I'm not about to propose a rewrite to fix either of them
without finding those who are most involved in these areas. Please come
join me on the bug.
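To illustrate the first half of that (a minimal sketch, not taken from the bug thread): sysconfig only ever describes the interpreter it is running under, which is exactly what a cross-compile cannot use.

```python
import sysconfig

# sysconfig answers questions about the *currently running* interpreter:
# its paths, compiler flags, and ABI details all come from this build.
ext_suffix = sysconfig.get_config_var("EXT_SUFFIX")  # e.g. ".cpython-38-x86_64-linux-gnu.so"
include_dir = sysconfig.get_paths()["include"]       # headers for this interpreter

# There is no supported way to point it at a different target, such as
# the CPython install inside a cross-compilation SDK.
```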
Hello all! I'm a relatively new contributor looking for a Core Dev sponsor for the following PEP:
- Python-Ideas Thread: https://email@example.com/thread/RJARZS…
- Bug Tracker Issue: https://bugs.python.org/issue39939
- Github PR for implementation: https://github.com/python/cpython/pull/18939
This is a proposal to add two new methods, ``cutprefix`` and
``cutsuffix``, to the APIs of Python's various string objects. In
particular, the methods would be added to Unicode ``str`` objects,
binary ``bytes`` and ``bytearray`` objects.
If ``s`` is one of these objects, and ``s`` has ``pre`` as a prefix, then
``s.cutprefix(pre)`` returns a copy of ``s`` in which that prefix has
been removed. If ``s`` does not have ``pre`` as a prefix, an
unchanged copy of ``s`` is returned. In summary, ``s.cutprefix(pre)``
is roughly equivalent to ``s[len(pre):] if s.startswith(pre) else s``.
The behavior of ``cutsuffix`` is analogous: ``s.cutsuffix(suf)`` is
roughly equivalent to
``s[:-len(suf)] if suf and s.endswith(suf) else s``.
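For reference, the equivalences above can be written out as a pure-Python sketch (this is not the actual implementation, which would be a C method on each type):

```python
def cutprefix(s, pre):
    # Return s without the prefix pre, or s unchanged if pre is absent.
    return s[len(pre):] if s.startswith(pre) else s

def cutsuffix(s, suf):
    # The `suf and` guard handles the empty suffix, where s[:-0]
    # would otherwise wrongly produce an empty result.
    return s[:-len(suf)] if suf and s.endswith(suf) else s
```

For example, `cutprefix("MiscTests", "Misc")` gives `"Tests"`, while `cutprefix("Tests", "Misc")` returns `"Tests"` unchanged.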
Antoine Pitrou [mailto:firstname.lastname@example.org]:
> This example is mixing up the notion of interpreter state and thread state.
Sorry about that; I was making a more general point and not paying much
attention to the specific names.
The general point is this: A thread-local variable can work as an implied
parameter without creating synchronisation problems. You do not have to worry
about painting yourself into an architectural corner by using thread-local
variables, because anything you can do with an explicit parameter, you could
also have done with an implied one in thread-local memory.
Correctness is not at stake. Either approach will work.
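A minimal sketch of that general point, using threading.local (the names set_context/do_work are hypothetical, purely for illustration):

```python
import threading

# The "implied parameter" lives in thread-local memory instead of
# being passed explicitly through every call.
_state = threading.local()

def set_context(ctx):
    _state.ctx = ctx

def do_work():
    # Reads the implied parameter; each thread sees only its own value,
    # so no synchronisation is required.
    return _state.ctx

results = {}

def worker(name):
    set_context(name)          # equivalent to passing `name` explicitly
    results[name] = do_work()

threads = [threading.Thread(target=worker, args=(n,)) for n in ("a", "b")]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Each thread observes only the value it set itself, exactly as if the value had been threaded through as an explicit argument.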
Read in the browser: https://www.python.org/dev/peps/pep-0585/ <https://www.python.org/dev/peps/pep-0585/>
Read the source: https://raw.githubusercontent.com/python/peps/master/pep-0585.rst <https://raw.githubusercontent.com/python/peps/master/pep-0585.rst>
The following PEP has been discussed on typing-sig already and a prototype implementation exists for it. I'm posting it now for wider feedback on python-dev, with the intent to present the final version for the Steering Council's consideration by mid-March.
Title: Type Hinting Generics In Standard Collections
Author: Łukasz Langa <lukasz(a)python.org>
Discussions-To: Typing-Sig <typing-sig(a)python.org>
Type: Standards Track
Static typing as defined by PEPs 484, 526, 544, 560, and 563 was built incrementally on top of the existing Python runtime and constrained by existing syntax and runtime behavior. This led to the existence of a duplicated collection hierarchy in the typing module due to generics (for example typing.List and the built-in list).
This PEP proposes to enable support for the generics syntax in all standard collections currently available in the typing module.
Rationale and Goals
This change removes the necessity for a parallel type hierarchy in the typing module, making it easier for users to annotate their programs and easier for teachers to teach Python.
Generic (n.) -- a type that can be parametrized, typically a container. Also known as a parametric type or a generic type. For example: dict.
Parametrized generic -- a specific instance of a generic with the expected types for container elements provided. Also known as a parametrized type. For example: dict[str, int].
Tooling, including type checkers and linters, will have to be adapted to recognize standard collections as generics.
On the source level, the newly described functionality requires Python 3.9. For use cases restricted to type annotations, Python files with the "annotations" future-import (available since Python 3.7) can parametrize standard collections, including builtins. To reiterate, that depends on the external tools understanding that this is valid.
Starting with Python 3.7, when from __future__ import annotations is used, function and variable annotations can parametrize standard collections directly. Example:
from __future__ import annotations
def find(haystack: dict[str, list[int]]) -> int:
    ...
Usefulness of this syntax before PEP 585 <https://www.python.org/dev/peps/pep-0585> is limited as external tooling like Mypy does not recognize standard collections as generic. Moreover, certain features of typing like type aliases or casting require putting types outside of annotations, in runtime context. While these are relatively less common than type annotations, it's important to allow using the same type syntax in all contexts. This is why starting with Python 3.9, the following collections become generic using __class_getitem__() to parametrize contained types:
tuple # typing.Tuple
list # typing.List
dict # typing.Dict
set # typing.Set
frozenset # typing.FrozenSet
type # typing.Type
collections.abc.Set # typing.AbstractSet
contextlib.AbstractContextManager # typing.ContextManager
contextlib.AbstractAsyncContextManager # typing.AsyncContextManager
re.Pattern # typing.Pattern, typing.re.Pattern
re.Match # typing.Match, typing.re.Match
Importing those from typing is deprecated. Due to PEP 563 <https://www.python.org/dev/peps/pep-0563> and the intention to minimize the runtime impact of typing, this deprecation will not generate DeprecationWarnings. Instead, type checkers may warn about such deprecated usage when the target version of the checked program is signalled to be Python 3.9 or newer. It's recommended to allow for those warnings to be silenced on a project-wide basis.
The deprecated functionality will be removed from the typing module in the first Python version released 5 years after the release of Python 3.9.0.
Parameters to generics are available at runtime
Preserving the generic type at runtime enables introspection of the type which can be used for API generation or runtime type checking. Such usage is already present in the wild.
Just like with the typing module today, the parametrized generic types listed in the previous section all preserve their type parameters at runtime:
>>> tuple[int, ...]
tuple[int, ...]
>>> ChainMap[str, list[str]]
collections.ChainMap[str, list[str]]
This is implemented using a thin proxy type that forwards all method calls and attribute accesses to the bare origin type with the following exceptions:
the __repr__ shows the parametrized type;
the __origin__ attribute points at the non-parametrized generic class;
the __args__ attribute is a tuple (possibly of length 1) of generic types passed to the original __class_getitem__;
the __parameters__ attribute is a lazily computed tuple (possibly empty) of unique type variables found in __args__;
the __getitem__ raises an exception to disallow mistakes like dict[str][str]. However it allows e.g. dict[str, T][int] and in that case returns dict[str, int].
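On Python 3.9+, those attributes can be checked directly (an illustrative session, not taken from the PEP):

```python
from typing import TypeVar

# Introspecting a parametrized generic at runtime (requires Python 3.9+).
alias = dict[str, int]
assert alias.__origin__ is dict          # the non-parametrized generic
assert alias.__args__ == (str, int)      # the provided type parameters
assert alias.__parameters__ == ()        # no free type variables

T = TypeVar("T")
partial = dict[str, T]
assert partial.__parameters__ == (T,)    # T is still free here...
assert partial[int] == dict[str, int]    # ...and can be substituted later
```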
This design means that it is possible to create instances of parametrized collections, like:
>>> l = list[str]()
>>> list is list[str]
False
>>> list == list[str]
False
>>> list[str] == list[str]
True
>>> list[str] == list[int]
False
>>> isinstance([1, 2, 3], list[str])
TypeError: isinstance() arg 2 cannot be a parametrized generic
>>> issubclass(list, list[str])
TypeError: issubclass() arg 2 cannot be a parametrized generic
>>> isinstance(list[str], types.GenericAlias)
True
Objects created with bare types and parametrized types are exactly the same. The generic parameters are not preserved in instances created with parametrized types, in other words generic types erase type parameters during object creation.
One important consequence of this is that the interpreter does not attempt to type check operations on the collection created with a parametrized type. This provides symmetry between:
l: list[str] = []
l = list[str]()
For accessing the proxy type from Python code, it will be exported from the types module as GenericAlias.
Pickling or (shallow- or deep-) copying a GenericAlias instance will preserve the type, origin, attributes and parameters.
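As a quick check of that guarantee (illustrative, assuming Python 3.9+):

```python
import copy
import pickle

alias = list[int]

restored = pickle.loads(pickle.dumps(alias))   # pickling round-trips the alias
copied = copy.copy(alias)                      # ...and so does copying

assert restored == alias and restored.__args__ == (int,)
assert copied == alias and copied.__origin__ is list
```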
Future standard collections must implement the same behavior.
A proof-of-concept or prototype implementation <https://bugs.python.org/issue39481> exists.
Keeping the status quo forces Python programmers to perform book-keeping of imports from the typing module for standard collections, making all but the simplest annotations cumbersome to maintain. The existence of parallel types is confusing to newcomers (why is there both list and List?).
The above problems don't exist with user-built generic classes, which combine runtime functionality with the ability to be used as generic type annotations. Making standard collections harder to use in type hinting than user classes hindered typing adoption and usability.
It would be easier to implement __class_getitem__ on the listed standard collections in a way that doesn't preserve the generic type, in other words:
>>> tuple[int, ...]
<class 'tuple'>
>>> collections.ChainMap[str, list[str]]
<class 'collections.ChainMap'>
This is problematic as it breaks backwards compatibility: current equivalents of those types in the typing module do preserve the generic type:
>>> from typing import List, Tuple, ChainMap
>>> Tuple[int, ...]
typing.Tuple[int, ...]
>>> ChainMap[str, List[str]]
typing.ChainMap[str, typing.List[str]]
As mentioned in the "Implementation" section, preserving the generic type at runtime enables runtime introspection of the type which can be used for API generation or runtime type checking. Such usage is already present in the wild.
Additionally, implementing subscripts as identity functions would make Python less friendly to beginners. For example, if a user mistakenly passes a list type instead of a list object to a function, and that function indexes the received object, the code currently raises an error right away:
>>> l = list
>>> l[-1]
TypeError: 'type' object is not subscriptable
With __class_getitem__ as an identity function:
>>> l = list
>>> l[-1]
<class 'list'>
The indexing being successful here would likely end up raising an exception at a distance, confusing the user.
Disallowing instantiation of parametrized types
Given that the proxy type which preserves __origin__ and __args__ is mostly useful for runtime introspection purposes, we might have disallowed instantiation of parametrized types.
In fact, forbidding instantiation of parametrized types is what the typing module does today for types which parallel builtin collections (instantiation of other parametrized types is allowed).
The original reason for this decision was to discourage spurious parametrization which made object creation up to two orders of magnitude slower compared to the special syntax available for those builtin collections.
This rationale is not strong enough to allow the exceptional treatment of builtins. All other parametrized types can be instantiated, including parallels of collections in the standard library. Moreover, Python allows for instantiation of lists using list() and some builtin collections don't provide special syntax for instantiation.
Making isinstance(obj, list[str]) perform a check ignoring generics
An earlier version of this PEP suggested treating parametrized generics like list[str] as equivalent to their non-parametrized variants like list for purposes of isinstance() and issubclass(). This would be symmetrical to how list[str]() creates a regular list.
This design was rejected because isinstance() and issubclass() checks with parametrized generics would read like element-by-element runtime type checks. The result of those checks would be surprising, for example:
>>> isinstance([1, 2, 3], list[str])
True
Note the object doesn't match the provided generic type but isinstance() still returns True because it only checks whether the object is a list.
If a library is faced with a parametrized generic and would like to perform an isinstance() check using the base type, that type can be retrieved using the __origin__ attribute on the parametrized generic.
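For example, a hypothetical helper (smart_isinstance is not part of the proposal) could do exactly that:

```python
from typing import get_origin

def smart_isinstance(obj, tp):
    # Fall back to the base type (via typing.get_origin) when handed a
    # parametrized generic such as list[str]; plain types pass through.
    origin = get_origin(tp)
    return isinstance(obj, origin if origin is not None else tp)
```

Here `smart_isinstance([1, 2, 3], list[str])` checks the object against plain `list`, ignoring the `str` parameter.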
Making isinstance(obj, list[str]) perform a runtime type check
This functionality requires iterating over the collection, which is a destructive operation for some of them. It would have been useful; however, implementing a type checker within Python that deals with complex types, nested type checking, type variables, string forward references, and so on is out of scope for this PEP.
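To see why such a check is destructive, consider this sketch (naive_check is hypothetical, not proposed API):

```python
import io

def naive_check(iterable, elem_type):
    # A naive element-by-element check: it has to consume the
    # iterable in order to look at its elements.
    return all(isinstance(x, elem_type) for x in iterable)

stream = io.StringIO("line1\nline2\n")
assert naive_check(stream, str)   # every line is a str, but...
assert stream.read() == ""        # ...the stream is now exhausted
```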
Note on the initial draft
An early version of this PEP discussed matters beyond generics in standard collections. Those unrelated topics were removed for clarity.
This document is placed in the public domain or under the CC0-1.0-Universal license, whichever is more permissive.
I think that PEP 573 is ready to be accepted, to greatly improve the state
of extension modules in CPython 3.9.
It has come a long way since the original proposal and went through several
iterations and discussions by various interested people, effectively
reducing its scope quite a bit. So this is the last call for comments on
the latest version of the PEP, before I will pronounce on it. Please keep
the discussion in this thread.
An unfortunate side-effect of making PyInterpreterState opaque in Python 3.8 is that it removed [PEP 523](https://www.python.org/dev/peps/pep-0523/) support. An issue was opened to try and fix this, but there seems to be a stalemate in the issue.
A key question is at what API level setting the frame evaluation function should live. No one is suggesting the stable ABI, but there isn't agreement on whether it belongs in the CPython API or the internal API (there also seems to be a suggestion to remove PEP 523 support entirely).
And regardless of that, there also seems to be disagreement about providing getters and setters to continue hiding PyInterpreterState, whichever API level the support is provided at (if any).
If you have an opinion, please weigh in on the issue.