On 05 Mar 2019 at 11:03, Neil Schemenauer wrote:
On 2019-03-04, Carl Shapiro wrote:
[...] instances of types created in the C-API can be allocated outside of the Python heap. Would that feature be preserved? You could keep it with PyObject* but it might be harder to do with PyHandle.
I can only speak for myself but I would like to kill off non-heap allocated types. That's not easy because nearly every extension module that defines a type does so using a static structure for the new type (not heap allocated). Some discussion here: [SNIP] The related thing I would like to change is to force all PyObject structures to be allocated by CPython memory allocators.
I don't agree.
To be at all useful, I think your last sentence needs to be "force all PyObject structures to be allocated by *the single CPython memory allocator for the current runtime*". That means we don't need to store the deallocator function for each object, and can simply pass the memory blocks to a known allocator (even if that's been switched out at runtime startup, it won't have changed in the meantime).
However, in the context of features like NVRAM, GPU/CPU contexts, and even subinterpreters and subprocesses, I think there's a huge advantage in having objects know how to deallocate themselves. Without this, there's no way to support these more advanced concepts transparently. IMHO, that would be missing a huge opportunity.
(Of course, if Py_DECREF somehow became a per-object/per-class virtual function, then this becomes trivial. Even now, the dealloc function is per-type, and I don't think we'd gain anything by removing that, while what we gain from increasing it to be per-object could be significant.)
Cheers, Steve