Look up Judy arrays: a specialized data structure for exactly this kind of problem. It has a simple API, but it's incredibly complex and architecture-specific under the hood.
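For a taste of how simple that API is, here's a minimal sketch using the Judy1 (bit set) flavor, assuming libjudy is installed; J1S, J1T, and J1FA are the documented set/test/free macros:

    #include <stdio.h>
    #include <Judy.h>

    int main(void) {
        Pvoid_t judy = (Pvoid_t) NULL;   /* an empty Judy1 array */
        Word_t index;
        int rc;

        /* Set two very sparse indices; internally the structure
           adapts its node types to the key distribution. */
        index = 42;      J1S(rc, judy, index);
        index = 9000000; J1S(rc, judy, index);

        /* Membership tests. */
        index = 42;      J1T(rc, judy, index);
        printf("42 present: %d\n", rc);   /* prints 1 */
        index = 43;      J1T(rc, judy, index);
        printf("43 present: %d\n", rc);   /* prints 0 */

        /* Free the entire array at once. */
        Word_t bytes_freed;
        J1FA(bytes_freed, judy);
        printf("freed %lu bytes\n", bytes_freed);
        return 0;
    }

Build with something like: cc demo.c -lJudy. All the machine-specific cleverness stays hidden behind those macros.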

People are always trying to optimize, but there are limits to how much you can gain on generic data structures (and to how much you can gain in general), and even specialized data structures often help only on specialized workloads.

On Sun, Mar 27, 2022 at 1:18 PM Jonathan Fine <jfine2358@gmail.com> wrote:
Hi

Thank you, Inada, for your prompt and helpful reply. Here's a link for caching the hash in the bytes object: https://bugs.python.org/issue46864

What I have in mind is making selected objects smaller, for example by using smaller pointers. But how can we know what performance benefit this would give?
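To make the "smaller pointers" idea concrete: one common approach in other VMs (not something CPython does today) is pointer compression, storing 32-bit offsets from a heap base instead of full 64-bit pointers. A rough sketch in C, with all names made up for illustration:

    #include <stdint.h>

    /* Set once when the heap is created; all objects must live
       within 4 GiB of this base for 32-bit offsets to work. */
    static char *heap_base;

    typedef uint32_t compressed_ptr;   /* 4 bytes instead of 8 */

    static inline compressed_ptr compress(void *p) {
        return (compressed_ptr)((char *)p - heap_base);
    }

    static inline void *decompress(compressed_ptr c) {
        return heap_base + c;
    }

The space saving is real, but every dereference pays a small decompression cost, which is exactly why I want to measure first.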

I think it would be helpful to know how much SLOWER things are when we make Python objects, say, 8 or 16 bytes LARGER. This would give an estimate of the improvement we could expect from making all Python objects smaller.
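One way I imagine running that experiment, as a sketch only (it assumes a from-source CPython build, and ob_pad is a name I've made up): pad the PyObject header in Include/object.h so every object grows by a fixed amount, then benchmark against a stock build.

    /* Include/object.h (layout as of CPython 3.10/3.11) with
       experimental padding added. ob_pad exists only to inflate
       every object for the measurement. */
    typedef struct _object {
        _PyObject_HEAD_EXTRA
        Py_ssize_t ob_refcnt;
        PyTypeObject *ob_type;
        char ob_pad[16];        /* +16 bytes on every object */
    } PyObject;

After rebuilding, a suite like pyperformance (pyperformance run on each build, then pyperformance compare on the two result files) should give the slowdown number.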

I've not done much performance testing before. Is anyone here interested in doing it, or in helping me do it? (Warning: I've never built Python before.)

with best regards

Jonathan