Can we make `PyLong_AsByteArray(PyLongObject* v, unsigned char* bytes, size_t n, int little_endian, int is_signed);` and `PyObject * PyLong_FromByteArray(const unsigned char* bytes, size_t n, int little_endian, int is_signed);` available as publicly documented functions?
They're already available internally, as can be seen here (https://github.com/python/cpython/blob/578c3955e0222ec7b3146197467fbb0fcfae…), so all that would be required (I think) is exporting the symbols and updating the documentation (https://docs.python.org/3/c-api/long.html).
The benefit would be that we could use these when we want fixed-width integer types without having to check the width of the fundamental integer types (so we could use `int32_t` rather than checking `sizeof(int)` or the corresponding macro). This would also be useful if someone wanted to use compiler-extension fixed-width types such as `__int128_t` (and the corresponding unsigned version).
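To make the use case concrete, here is a hedged sketch of how the proposed public functions could be used to round-trip an `int32_t`. Note that `PyLong_FromByteArray`/`PyLong_AsByteArray` are the *proposed* public spellings and do not exist yet; only the underscore-prefixed private versions do, and the helper names below are made up:

```c
#include <stdint.h>
#include <string.h>
#include <Python.h>

/* Hypothetical: these call the *proposed* public functions; today only
   the underscore-prefixed private versions exist. */
static PyObject *
int32_to_pylong(int32_t value)
{
    unsigned char buf[sizeof(int32_t)];
    memcpy(buf, &value, sizeof(buf));          /* native byte order */
    return PyLong_FromByteArray(buf, sizeof(buf),
                                /*little_endian=*/PY_LITTLE_ENDIAN,
                                /*is_signed=*/1);
}

static int
pylong_to_int32(PyObject *obj, int32_t *out)
{
    unsigned char buf[sizeof(int32_t)];
    /* Fails (returns -1) if the value does not fit in 32 signed bits. */
    if (PyLong_AsByteArray((PyLongObject *)obj, buf, sizeof(buf),
                           PY_LITTLE_ENDIAN, 1) < 0)
        return -1;
    memcpy(out, buf, sizeof(buf));
    return 0;
}
```

No `sizeof(int)` checks are needed anywhere: the buffer width alone fixes the range.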
I have been following the discussions on removing deprecated
macros and functions from the C API. I checked my code, PyCXX,
and found a couple of issues that are easy to fix. But I may have
missed something.
What would help me is to be able to build extensions with
all the deprecated pieces of the C API removed,
a bit like the way the limited C API works via the
Py_LIMITED_API define.
What do you think?
On 2020-06-14 22:10, cpython(a)nicwatson.org wrote:
> Please excuse if this is the wrong mailing list. I couldn't find one for module maintainers.
This is relevant to capi-sig(a)python.org; let's continue here.
> I maintain an open source Python module in C. I'm trying to verify for the first time that the module still works with cpython 3.9. This module does *not* use the "limited" C API.
> In building my module against 3.9b3, I'm getting a missing-declaration warning on _Py_ForgetReference. My module builds and passes tests fine; this is just a compiler warning issue.
What does the _Py_ForgetReference function do? The [documentation] says
it's only for use in the interpreter core, so I'd assume it's private.
> The change that caused this was made in:
> commit f58bd7c1693fe041f7296a5778d0a11287895648
> Author: Victor Stinner <vstinner(a)python.org>
> Date: Wed Feb 5 13:12:19 2020 +0100
> bpo-39542: Make PyObject_INIT() opaque in limited C API (GH-18363)
> I definitely need the _Py_ForgetReference call for a particularly hairy error condition (https://github.com/jnwatson/py-lmdb/blob/master/lmdb/cpython.c#L888 if you're curious). In fact, my tests will seg fault if I don't have that call and trace refs is enabled.
I can't follow the reasoning behind the code easily. Why do you use
_Py_ForgetReference and PyObject_Del, instead of Py_DECREF(self)?
> Should I put an #ifdef Py_TRACE_REFS around the call? Ignore it? What do you think is the proper resolution?
The Structure interface of ctypes is intended to be subclassed, yet the
subclasses' __init__ and __new__ aren't invoked. I think this was intended as an
optimization, but it is inconsistent and confusing. The only documentation
of this behavior takes the form of a unit test, which may not have ever
been meant as documentation.
There is an open issue about this: https://bugs.python.org/issue38860
I'm of the opinion it should not be "special"; what does the community
think?
Since there is a discussion about the limited API and headers: a long
time ago I asked about this on python-dev:
It concerns me a bit that the limited API does not seem to support
metaclasses in PyType_FromSpec (or FromSpecWithBases).
In particular, last time I checked, it seemed impossible to extend the
type struct within such a metaclass.
There also seemed to be an issue with PyType_FromSpecWithBases
allocating exactly a base type-object, instead of using `tp_alloc` as
one would expect.
This and how to efficiently cache Python objects in methods are the two
main concerns I have right now for long-term full adoption of the
limited API. (There are other difficulties, but for the ones I am
aware of, I at least have an idea of how they need to be addressed.)
I suppose it may actually be possible to hack around the FromSpec
issues, although I am not sure right now, unless subclassing
PyType_Type in C is for some reason different and very much
discouraged. And e.g. PySide seemed to "do" this, and got away with it, because
there is an additional NULL at the end, as a termination for slots or
so, which is unused for them and which they just happen to use.
Now that PEP 573 (Module State Access from C Extension Methods) is in,
it's time to address the most important part missing from it: module
state access from slot methods (like tp_init or nb_add).
This has a long history; see e.g. a thread I started in 2015:
Most recently, a possible solution, an MRO walker, was one of the last
things removed from PEP 573 before acceptance, because the proposed
solution was broken: see
Here's the removed text:
> Slot methods
> To allow access to `per-module state`_ from slot methods, an MRO walker
> will be implemented::
> PyTypeObject *PyType_DefiningTypeFromSlotFunc(PyTypeObject *type,
> int slot, void *func)
> The walker will go through bases of heap-allocated ``type``
> and search for the class that defines ``func`` at its ``slot``.
> The ``func`` does not need to be inherited by ``type`` (i.e. it may have been
> overridden in a subclass). The only requirement for the walker to find the
> defining class is that the defining class must be heap-allocated.
> On failure, an exception is set and NULL is returned.
and the last iteration of the implementation is here:
An MRO walker is not efficient. The other viable options I'm aware of are:
- putting a pointer in the class object (which would need reliable
class-level storage, see
- solutions that aren't fully general, like context variables or putting
a pointer in each instance (thorny because constructors/initializers are
- "__typeslots__" mentioned in
which are, IMO, quite invasive
Given that, I believe that some kind of MRO walker is still the best
immediate solution, mainly because it's easiest to implement. If done
well, it can use only the current public API (limiting the maintenance
burden you'd have with something wired into the internals, and allowing
it to be copied into 3rd-party code to support older Python versions), and
should be easy to replace with a more performant solution once one exists.
And if it's added, more modules could switch to the stable ABI and
eschew process-global state, so we get more exposure and experience with
the remaining problems.
Now, the main problem with "PyType_DefiningTypeFromSlotFunc" is that it
looked up slot functions, which can be set from Python code, and doing so
can easily introduce C-level failures.
To solve this, my next thought is instead looking for something only
available in C: the PyModuleDef.
Here's an implementation of such an MRO walker, which I call
My thinking is to start using this in 3.10 as internal API to get a feel
for its strengths and weaknesses.
It should be easy to rewrite the function to use only public API, and
add it to another project (like Cython). It should also be easy to make
it public once we're sure it's the way forward.
What are your thoughts?
As I wrote in a previous email, the C API of CPython is evolving
slowly to hide more and more implementation details. Sadly, some
changes break backward compatibility. Sometimes a new function
must be used, except that the function doesn't exist in older Python
versions. See:
"Update on C API changes to hide implementation details"
Would it be possible to ship a header file (a single ".h" file) and/or
even a dynamic library to provide new functions to old Python
versions? I'm not sure whether it's possible to ship a header file as a
package on PyPI and make it available to C compilers easily without
monkey-patching distutils. Another option is to encourage Linux
distributions to package such a header file to make it easy to install
on the system.
Another (worse?) option is to advise authors of each C extension
module to vendor a copy of the header file in their project and update
it from time to time when needed.
At least in C, we can provide most "missing" functions on older Python
versions as macros or static inline functions. Python 3.6 now requires
a subset of C99, including static inline functions.
The situation is very similar to the Python 2 to Python 3 transition.
Benjamin Peterson wrote the "six" module, which quickly became very popular.
First, projects vendored a copy of the "six" module as a submodule of
their project (src/project/six.py, imported as "project.six"). Then,
slowly, the "six" module was shipped by Linux distributions or
installed via "pip install six".
Cython already does that internally. A few examples:
#if PY_VERSION_HEX < 0x030200A4
typedef long Py_hash_t;
#define __Pyx_PyInt_FromHash_t PyInt_FromLong
#define __Pyx_PyInt_AsHash_t __Pyx_PyIndex_AsHash_t
#else
#define __Pyx_PyInt_FromHash_t PyInt_FromSsize_t
#define __Pyx_PyInt_AsHash_t __Pyx_PyIndex_AsSsize_t
#endif

#if CYTHON_COMPILING_IN_LIMITED_API
#define __Pyx_PyThreadState_Current PyThreadState_Get()
#elif !CYTHON_FAST_THREAD_STATE
#define __Pyx_PyThreadState_Current PyThreadState_GET()
#elif PY_VERSION_HEX >= 0x03060000
//#elif PY_VERSION_HEX >= 0x03050200
// Actually added in 3.5.2, but compiling against that does not
// guarantee that we get imported there.
#define __Pyx_PyThreadState_Current _PyThreadState_UncheckedGet()
#elif PY_VERSION_HEX >= 0x03000000
#define __Pyx_PyThreadState_Current PyThreadState_GET()
#else
#define __Pyx_PyThreadState_Current _PyThreadState_Current
#endif

#if PY_VERSION_HEX >= 0x030900A4
#define __Pyx_SET_REFCNT(obj, refcnt) Py_SET_REFCNT(obj, refcnt)
#define __Pyx_SET_SIZE(obj, size) Py_SET_SIZE(obj, size)
#else
#define __Pyx_SET_REFCNT(obj, refcnt) Py_REFCNT(obj) = (refcnt)
#define __Pyx_SET_SIZE(obj, size) Py_SIZE(obj) = (size)
#endif