Caching function pointers in type objects

In CPython, is it safe to cache function pointers that are in type objects? For example, if I know that some_type->tp_richcompare is non-NULL, and I call it (which may execute arbitrary user code), can I assume that some_type->tp_richcompare is still non-NULL?
--
Daniel Stutzbach, Ph.D.
President, Stutzbach Enterprises, LLC <http://stutzbachenterprises.com>
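For concreteness, here is a minimal sketch of the pattern in question; the helper and its loop are hypothetical, not CPython code. It caches the slot pointer once and then calls it repeatedly, where each call may run arbitrary user code.

    #include <Python.h>

    /* Hypothetical helper: cache some_type->tp_richcompare once and call it
     * in a loop.  Each call may execute arbitrary user code (for example a
     * __lt__ method), which is why the validity of the cached pointer after
     * the first call is the question here. */
    static int
    compare_all_lt(PyObject **items, Py_ssize_t n, PyObject *pivot)
    {
        richcmpfunc cmp = Py_TYPE(pivot)->tp_richcompare;   /* cached pointer */
        Py_ssize_t i;

        if (cmp == NULL)
            return -1;
        for (i = 0; i < n; i++) {
            PyObject *res = cmp(items[i], pivot, Py_LT);     /* may run user code */
            if (res == NULL)
                return -1;
            Py_DECREF(res);
        }
        return 0;
    }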

2010/3/2 Daniel Stutzbach <daniel@stutzbachenterprises.com>:
In CPython, is it safe to cache function pointers that are in type objects?
For example, if I know that some_type->tp_richcompare is non-NULL, and I call it (which may execute arbitrary user code), can I assume that some_type->tp_richcompare is still non-NULL?
Not unless it's a builtin. Somebody could have deleted the rich comparison methods.
--
Regards,
Benjamin

I don't think this will help you solve your problem, but one thing we've done in unladen swallow is to hack PyType_Modified to invalidate our own descriptor caches. We may eventually want to extend that into a callback interface, but it probably will never be considered an API that outside code should depend on.

Reid

On Tue, Mar 2, 2010 at 9:57 PM, Benjamin Peterson <benjamin@python.org> wrote:
2010/3/2 Daniel Stutzbach <daniel@stutzbachenterprises.com>:
In CPython, is it safe to cache function pointers that are in type objects?
For example, if I know that some_type->tp_richcompare is non-NULL, and I call it (which may execute arbitrary user code), can I assume that some_type->tp_richcompare is still non-NULL?
Not unless it's a builtin. Somebody could have deleted the rich comparison methods.
--
Regards,
Benjamin
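Reid's note about hooking PyType_Modified points at the usual shape of such a cache: snapshot the type's tp_version_tag next to the cached pointer and treat the entry as stale whenever the tag no longer matches. A rough sketch along those lines, using CPython's existing version-tag machinery; this is only an illustration, not the Unladen Swallow code:

    #include <Python.h>

    /* Sketch of a version-tag-guarded slot cache.  PyType_Modified()
     * invalidates tp_version_tag when a type (or one of its bases) changes,
     * so a stale entry can be detected before the cached pointer is reused. */
    typedef struct {
        PyTypeObject *type;          /* type the entry was filled for */
        unsigned int version_tag;    /* snapshot of type->tp_version_tag */
        richcmpfunc cmp;             /* cached tp_richcompare */
    } cached_cmp;

    static richcmpfunc
    get_cached_richcompare(cached_cmp *cache, PyTypeObject *tp)
    {
        if (cache->type != tp
            || !PyType_HasFeature(tp, Py_TPFLAGS_VALID_VERSION_TAG)
            || cache->version_tag != tp->tp_version_tag) {
            /* Miss or invalidated entry: refill from the type itself. */
            cache->type = tp;
            cache->version_tag = tp->tp_version_tag;
            cache->cmp = tp->tp_richcompare;
        }
        return cache->cmp;
    }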

On Tue, Mar 2, 2010 at 9:06 PM, Reid Kleckner <rnk@mit.edu> wrote:
I don't think this will help you solve your problem, but one thing we've done in unladen swallow is to hack PyType_Modified to invalidate our own descriptor caches. We may eventually want to extend that into a callback interface, but it probably will never be considered an API that outside code should depend on.
Thanks Reid and Benjamin for the information.

I think I see a way to dramatically speed up PyObject_RichCompareBool when comparing immutable, built-in, non-container objects (int, float, str, etc.). It would speed up list.sort when the key is one of those types, as well as most operations on the ubiquitous dictionary with str keys.

Is that a worthwhile avenue to pursue, or is it likely to be redundant with Unladen Swallow's optimizations? If I can find time to pursue it, would it be best for me to implement it as a patch to Unladen Swallow, CPython trunk, or CPython py3k?
--
Daniel Stutzbach, Ph.D.
President, Stutzbach Enterprises, LLC <http://stutzbachenterprises.com>
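Daniel does not spell the optimization out in this message, so the following is only one plausible reading of it (an assumption, not his actual patch): for exact, built-in, immutable types the tp_richcompare slot cannot be deleted out from under the caller, so a comparison loop could call the slot directly and skip the generic dispatch in PyObject_RichCompareBool.

    #include <Python.h>

    /* Hypothetical fast path for < (py3k type names); anything unrecognised
     * falls back to the generic routine. */
    static int
    fast_lt_bool(PyObject *v, PyObject *w)
    {
        PyTypeObject *tp = Py_TYPE(v);
        PyObject *res;
        int ok;

        if (tp != Py_TYPE(w)
            || (tp != &PyLong_Type && tp != &PyFloat_Type && tp != &PyUnicode_Type))
            return PyObject_RichCompareBool(v, w, Py_LT);

        /* Exact built-in immutable type: the slot cannot change, so it is
         * safe to call it directly and skip the per-call dispatch. */
        res = tp->tp_richcompare(v, w, Py_LT);
        if (res == NULL)
            return -1;
        ok = PyObject_IsTrue(res);
        Py_DECREF(res);
        return ok;
    }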

2010/3/3 Daniel Stutzbach <daniel@stutzbachenterprises.com>:
I think I see a way to dramatically speed up PyObject_RichCompareBool when comparing immutable, built-in, non-container objects (int, float, str, etc.). It would speed up list.sort when the key is one of those types, as well as most operations on the ubiquitous dictionary with str keys.
Is that a worthwhile avenue to pursue, or is it likely to be redundant with Unladen Swallow's optimizations?
Perhaps you could explain what exactly you want to do. :) That would help us make a judgment.
If I can find time to pursue it, would it be best for me to implement it as a patch to Unladen Swallow, CPython trunk, or CPython py3k?
Your choice.
--
Regards,
Benjamin

On Wed, Mar 3, 2010 at 3:29 PM, Benjamin Peterson <benjamin@python.org> wrote:
2010/3/3 Daniel Stutzbach <daniel@stutzbachenterprises.com>:
I think I see a way to dramatically speed up PyObject_RichCompareBool when comparing immutable, built-in, non-container objects (int, float, str, etc.). It would speed up list.sort when the key is one of those types, as well as most operations on the ubiquitous dictionary with str keys.
(Correcting myself.) I just noticed that CPython already optimizes dictionaries with str-only keys and skips PyObject_RichCompareBool, so there's no speed-up to be had there. I think that existing optimization would be redundant with the one I have in mind, so it could perhaps be taken out, which would simplify the dict code a bit and save a pointer in the dict structure.
Perhaps you could explain what exactly you want to do. :) That would help us make a judgment.
It'd actually be a pretty small patch, so I think I should just explain it by actually writing it. ;-)
--
Daniel Stutzbach, Ph.D.
President, Stutzbach Enterprises, LLC <http://stutzbachenterprises.com>
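The existing str-key fast path Daniel refers to is the per-dict lookup-function pointer (ma_lookup in 2.x's dictobject.c): it starts out pointing at a string-specialised routine and is swapped to the generic one the first time a non-string key shows up. A toy illustration of that pattern, with simplified names and signatures rather than CPython's actual ones:

    #include <Python.h>

    typedef struct toy_dict toy_dict;
    typedef int (*lookup_fn)(toy_dict *d, PyObject *key);

    struct toy_dict {
        lookup_fn lookup;   /* plays the role of ma_lookup */
        /* ... hash table storage omitted ... */
    };

    static int
    lookup_generic(toy_dict *d, PyObject *key)
    {
        /* General path: key comparisons would go through
         * PyObject_RichCompareBool(key, stored_key, Py_EQ). */
        return 0;
    }

    static int
    lookup_str_only(toy_dict *d, PyObject *key)
    {
        /* py3k spelling; PyString_CheckExact on 2.x trunk. */
        if (!PyUnicode_CheckExact(key)) {
            d->lookup = lookup_generic;     /* downgrade once, permanently */
            return lookup_generic(d, key);
        }
        /* All keys so far are exact strings, so a cheap specialised string
         * comparison can be used and PyObject_RichCompareBool skipped. */
        return 0;
    }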

Hey Daniel,

On Wed, Mar 3, 2010 at 1:24 PM, Daniel Stutzbach <daniel@stutzbachenterprises.com> wrote:
On Tue, Mar 2, 2010 at 9:06 PM, Reid Kleckner <rnk@mit.edu> wrote:
I don't think this will help you solve your problem, but one thing we've done in unladen swallow is to hack PyType_Modified to invalidate our own descriptor caches. We may eventually want to extend that into a callback interface, but it probably will never be considered an API that outside code should depend on.
Thanks Reid and Benjamin for the information.
I think I see a way to dramatically speed up PyObject_RichCompareBool when comparing immutable, built-in, non-container objects (int, float, str, etc.). It would speed up list.sort when the key is one of those types, as well as most operations on the ubiquitous dictionary with str keys.
That definitely sounds worth pursuing.
Is that a worthwhile avenue to pursue, or is it likely to be redundant with Unladen Swallow's optimizations?
I don't believe it will be redundant with the optimizations in Unladen Swallow.
If I can find time to pursue it, would it be best for me to implement it as a patch to Unladen Swallow, CPython trunk, or CPython py3k?
I would recommend patching py3k, with a backport to trunk.

Thanks,
Collin Winter

On Wed, Mar 3, 2010 at 4:34 PM, Collin Winter <collinwinter@google.com> wrote:
I would recommend patching py3k, with a backport to trunk.
After thinking it over, I'm inclined to patch trunk, so I can run the Unladen Swallow macro-benchmarks, then forward-port to py3k. I'm correct in understanding that it will be a while before the Unladen Swallow benchmarks can support Python 3, right?
--
Daniel Stutzbach, Ph.D.
President, Stutzbach Enterprises, LLC <http://stutzbachenterprises.com>

On Wed, Mar 3, 2010 at 2:41 PM, Daniel Stutzbach <daniel@stutzbachenterprises.com> wrote:
On Wed, Mar 3, 2010 at 4:34 PM, Collin Winter <collinwinter@google.com> wrote:
I would recommend patching py3k, with a backport to trunk.
After thinking it over, I'm inclined to patch trunk, so I can run the Unladen Swallow macro-benchmarks, then forward-port to py3k.
I'm correct in understanding that it will be a while before the Unladen Swallow benchmarks can support Python 3, right?
That's correct; porting the full benchmark suite to Python 3 will require projects like Django to support Python 3.

Collin Winter
Participants (4):
- Benjamin Peterson
- Collin Winter
- Daniel Stutzbach
- Reid Kleckner