[Python-Dev] Re: Activating pymalloc
Vladimir Marangozov
vladimir.marangozov@optimay.com
Fri, 15 Mar 2002 20:24:14 +0100
Hi guys,
I try to follow python-dev remotely from time to time, but I
can't be as active as I used to be. Sorry about that.
Still, I can be reached at Vladimir.Marangozov@optimay.com,
so if I can be of any help in the areas I was involved in,
you can always cc me.
I just browsed this thread quickly and I agree on almost all
of the issues:
1. The docs are not perfect and should be updated. In particular,
what Tim mentioned -- adding a table summarizing the APIs --
would be helpful. My original intention for the docs was to
strip out some internal details that the usual extension
writer doesn't need to know about. But Guido is right that
the comments in the .h files are the reference, and they are
probably much clearer.
2. I agree that the macro chain (especially on the pymalloc side)
is not that useful in the end, so maybe all the PyCore_ macros
can be removed. The allocator's function names can be cast in
stone and the _THIS_xxx names in obmalloc.c replaced with them.
3. I do not agree, however, that one would want to call pymalloc
explicitly. The object allocator should be accessed through
PyObject_xxx and the raw memory allocator through PyMem_
(the first sketch after this list shows the intended pairing).
4. Originally, I kept PyMem_ from using pymalloc because profiling
showed that more than 90% of all small allocations are object
allocations. Non-object allocations were either big (in which
case pymalloc is just overhead) or accounted for a very small
percentage that can be neglected. This is why pymalloc became
obmalloc. Practical reasons.
5. About breaking extensions -- early versions of pymalloc had
debugging support built in which detected when a block
allocated with malloc was free'd or realloc'd with pymalloc.
I used it at the time to clean up the baseline and the standard
extensions shipped with 2.0 from mixed API usage. After that
was done, though, I removed it, thinking it wouldn't be of much
help. Apparently I was wrong, and perhaps this debugging
functionality would be helpful again (the first sketch after
this list shows the kind of mixing it detected). Still, the way
to go is to fix the 'faulty' modules and document the concepts
better. Yes, 1.5.2-era extensions would be the primary victims
here, but the memory work was done post-1.5.2 and people can't
stop Python from evolving.
6. To be honest, it escaped me too that PyObject_NEW is no faster
than PyObject_New. Yes -- redirect the macros to the functions
(the second sketch after this list illustrates the idea).
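To illustrate points 3 and 5, here is a minimal sketch of how the
two allocation families are meant to be paired. Only the Py* calls
are the real API; MyObject, MyObject_Type, make_thing and
destroy_thing are made-up names for illustration:

    #include <Python.h>

    typedef struct {
        PyObject_HEAD
        double *data;             /* raw, non-object memory */
    } MyObject;

    extern PyTypeObject MyObject_Type;  /* assumed defined elsewhere */

    static MyObject *
    make_thing(int n)
    {
        /* Object memory: allocate and release with PyObject_xxx. */
        MyObject *self = PyObject_New(MyObject, &MyObject_Type);
        if (self == NULL)
            return NULL;

        /* Raw memory: allocate and release with PyMem_xxx. */
        self->data = (double *)PyMem_Malloc(n * sizeof(double));
        if (self->data == NULL) {
            PyObject_Del(self);
            PyErr_NoMemory();
            return NULL;
        }
        return self;
    }

    static void
    destroy_thing(MyObject *self)
    {
        PyMem_Free(self->data);   /* matches PyMem_Malloc */
        PyObject_Del(self);       /* matches PyObject_New */
    }

The mixing that the old debug support flagged is crossing the
families: calling free() or PyMem_Free() on a block obtained from
PyObject_New, or PyObject_Del() on a block obtained from malloc().
With pymalloc enabled, such a block may live in a pymalloc arena
that the system allocator knows nothing about.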
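For point 6, a sketch of what redirecting the capitalized macros to
the functions could look like (an illustration of the idea, not an
actual patch):

    /* objimpl.h (sketch): make the old all-caps macros plain
       aliases for the type-safe functions instead of separate
       inline expansions. */
    #define PyObject_NEW(type, typeobj)        PyObject_New(type, typeobj)
    #define PyObject_NEW_VAR(type, typeobj, n) PyObject_NewVar(type, typeobj, n)
    #define PyObject_DEL(op)                   PyObject_Del(op)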
More generally, as a historical recap, the original goal was to
introduce the core concepts of a Python heap and a Python object
heap through the APIs, which was a precondition for any
Python-specific allocator work. Once that was done, the
integration of a Python malloc had to be as seamless as possible,
and so did the switch to using it; this is probably where some of
the macro layers were introduced and then abused (in my eyes, not
a big deal anyway). So look at it on the bright side -- the goals
are now almost achieved :-)
Anything else I can help you with?
Vladimir