If a project does R releases per year for P platforms and needs to support V versions of Python, it would normally have to build R * P * V wheels. With a stable ABI, that drops to R * P. That's the key point, right?
Can HPy do that?
Actually, it can do even better than that. When you compile an HPy extension you can choose which ABI to target:

- CPython ABI: in this mode, all HPy_* calls are statically translated (using static inline functions) into the corresponding Py_* calls, and it generates modules like foo.cpython-38-x86_64-linux-gnu.so, indistinguishable from a "normal" module.
- HPy universal ABI: in this mode, it generates something like foo.hpy-x86_64-linux-gnu.so: all API calls go through the HPyContext (which is basically a giant vtable). This module can be loaded by any implementation that supports the HPy universal ABI, including CPython, PyPy and GraalPython.
The main drawback of the universal ABI is that it's slightly slower, because it goes through the vtable indirection for every call, in particular HPy_Dup/HPy_Close, which are mapped to Py_INCREF/Py_DECREF. Some early benchmarks indicate a 5-10% slowdown. We haven't benchmarked it against the stable ABI, though.
Of course, to be fully usable, the HPy universal ABI will need special support from PyPI/pip/etc., because at the moment it is impossible to package it inside a wheel, AFAIK.