[Python-Dev] Investigating Python memory footprint of one real Web application

Victor Stinner victor.stinner at gmail.com
Fri Jan 20 06:17:13 EST 2017


2017-01-20 11:49 GMT+01:00 INADA Naoki <songofacandy at gmail.com>:
> Report is here
> https://gist.github.com/methane/ce723adb9a4d32d32dc7525b738d3c31

Very interesting report, thanks!

> My thoughts are:
>
> * Interning (None,) seems worthwhile.

I guess that (None,) comes from constants of code objects:

>>> def f(): pass
...
>>> f.__code__.co_consts
(None,)
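The duplication across code objects can be illustrated with a small, hedged example (whether the two tuples are the same object is version-dependent):

```python
# Every function body that just falls off the end needs None among its
# constants, so (None,) recurs once per code object.
def f(): pass
def g(): pass

assert f.__code__.co_consts == (None,)
assert g.__code__.co_consts == (None,)
# Whether f and g share one tuple object depends on the CPython version;
# interning (None,) would guarantee that they do.
```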


> * There are many empty dicts.  Allocating ma_keys lazily may reduce
> memory usage.

Would you be able to estimate how many bytes this change would save?
Comparing that against the total memory usage would give an idea of the
percentage.
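As a rough upper bound, one could multiply the number of empty dicts by the size of an empty dict (the count below is a made-up placeholder, not a figure from the report):

```python
import sys

n_empty_dicts = 10_000        # hypothetical count, not from the report
per_dict = sys.getsizeof({})  # CPython-specific; includes the ma_keys table
# A lazy ma_keys would save only the keys-table part of this, since the
# PyDictObject header itself stays allocated, so treat it as an upper bound.
print("upper bound on savings:", n_empty_dicts * per_dict, "bytes")
```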

By the way, it would help if you can add the total memory usage
computed by tracemalloc (get_traced_memory()[0]) in your report.
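For reference, a minimal tracemalloc snippet along those lines (illustrative only; the report's own instrumentation may differ):

```python
import tracemalloc

tracemalloc.start()
# ... the application workload would run here ...
data = [dict() for _ in range(1000)]  # stand-in workload

# get_traced_memory() returns (current, peak) in bytes.
current, peak = tracemalloc.get_traced_memory()
print("total traced memory:", current, "bytes")
tracemalloc.stop()
```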

About empty dicts: do you expect that they come from shared keys?
Anyway, if the change has a negligible impact on performance, go for it
:-)


> but other namespaces or annotations, like ('return',) or ('__wrapped__',) are not shared

Maybe we could intern every tuple that contains only a single string?

Instead of interning, would it be possible to at least merge
duplicated immutable objects?
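Merging duplicated immutables could be sketched with a small canonicalization cache; `intern_tuple` below is an illustrative helper, not an existing CPython API:

```python
# Hypothetical sketch: map each tuple to a single canonical instance.
_interned = {}

def intern_tuple(t):
    # setdefault stores the first tuple seen and returns it for any
    # later tuple that compares equal, so duplicates get merged.
    return _interned.setdefault(t, t)

# Build two distinct-but-equal tuples dynamically to avoid any
# compile-time constant folding.
a = intern_tuple(tuple(['return']))
b = intern_tuple(tuple(['return']))
assert a is b  # equal tuples now share one object
```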


> * Most large strings are docstrings.  Would an option to trim
> docstrings, without disabling asserts, be worthwhile?

Yeah, you are not the first one to propose this. The problem is deciding
how to name the .pyc file.
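For context, PEP 488 already encodes the numeric optimization level in the .pyc name via importlib; the open question is what tag a docstring-only mode would get:

```python
import importlib.util

# Existing behavior (PEP 488): the optimization level becomes part of
# the cached file name, e.g. mod.cpython-XY.opt-2.pyc for level 2.
print(importlib.util.cache_from_source('mod.py', optimization=''))
print(importlib.util.cache_from_source('mod.py', optimization=2))
```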

My PEP 511 proposes to add a new -O command line option and a new
sys.implementation.optim_tag string to support this feature:
https://www.python.org/dev/peps/pep-0511/

Since the code transformer part of the PEP seems to be controversial,
maybe we should extract only these two changes from the PEP and
implement them? I also want -O noopt :-) (to disable the peephole
optimizer)
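Today the interpreter only exposes the numeric -O level; a string optim_tag as proposed in the PEP would generalize it. A sketch of the current state (not the proposal):

```python
import sys

# 0 by default, 1 under -O (asserts stripped), 2 under -OO (docstrings
# stripped as well). PEP 488 derives the .pyc suffix from this level.
print(sys.flags.optimize)
```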

Victor
