[Python-Dev] tracemalloc: add an optional memory limit

Victor Stinner victor.stinner at gmail.com
Mon Dec 9 02:56:26 CET 2013


Hi,

PEP 454 (the tracemalloc module) has been implemented in Python 3.4
beta 1. Previously, I also wrote the pyfailmalloc project to test how
Python behaves on memory allocation failures. I found various bugs
using this tool.

I propose to add an optional memory limit feature to the tracemalloc
module. It would allow setting an arbitrary limit (ex: 100 MB) to
check how your application behaves on systems with little free
memory (ex: embedded devices).

What do you think of this enhancement? Would it fit the Python 3.4
timeline? It's a new feature, but for a module which was introduced in
Python 3.4. I originally wanted to propose the feature directly in the
PEP, but Python was not ready for it: Python failed to handle memory
allocation failures correctly.

On Linux, it's possible to globally limit the "address space" of a
process, but this value is not convenient. For example, the address
space includes shared memory and read-only memory mappings of files,
whereas this memory should not count against the memory available for
other applications. tracemalloc only counts memory directly allocated
by Python, and so only private memory which directly reduces the
memory available for other applications.
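To illustrate why the global limit is coarse, here is a small sketch
using the standard resource module, which exposes the Linux
RLIMIT_AS ("address space") limit mentioned above. This is my own
illustration, not part of the patch:

```python
import resource

# RLIMIT_AS caps the whole virtual address space of the process,
# including read-only file mappings and shared memory, so it is hard
# to relate to the private memory actually allocated by Python.
soft, hard = resource.getrlimit(resource.RLIMIT_AS)

# Both values are integers; resource.RLIM_INFINITY means "no limit".
print(soft == resource.RLIM_INFINITY, hard == resource.RLIM_INFINITY)
```

A process could lower the soft limit with resource.setrlimit(), but
every mmap() in the process then counts against it, not just Python's
allocations.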

Technically, it's convenient to implement this feature in
tracemalloc. No new code needs to be added to count how many bytes
were allocated by Python: tracemalloc already does that (indirectly).
It's simple to modify tracemalloc to make malloc() fail if an
allocation would make the counter exceed the limit.

I wrote a patch in the following issue to add an optional memory_limit
parameter to tracemalloc.start():
http://bugs.python.org/issue19817
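The memory_limit parameter only exists in the proposed patch, not in
any released Python. As a rough illustration of the accounting it
relies on, here is a sketch using the public tracemalloc API from
Python 3.4; the check_limit() helper is hypothetical and only mimics,
in Python, the check that the patch performs inside the C allocator
hooks:

```python
import tracemalloc

LIMIT = 100 * 1024 * 1024  # 100 MB, an arbitrary example limit

def check_limit(limit=LIMIT):
    """Return True if the memory currently traced is within *limit*.

    The real patch does this check in C and makes malloc() fail when
    the limit would be exceeded; this helper only shows the counter
    that tracemalloc already maintains.
    """
    current, peak = tracemalloc.get_traced_memory()
    return current <= limit

tracemalloc.start()
data = [b"x" * 1024 for _ in range(100)]  # allocate roughly 100 KB
print(check_limit())  # well under the 100 MB example limit
tracemalloc.stop()
```

In the actual patch the limit is passed once, as
tracemalloc.start(memory_limit=...), and enforced transparently on
every allocation rather than polled like this.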

The patch is short:

 Doc/library/tracemalloc.rst  |    6 +++-
 Lib/test/test_tracemalloc.py |   11 +++++++-
 Modules/_tracemalloc.c       |   56 ++++++++++++++++++++++++++++++++++++-------
 3 files changed, 63 insertions(+), 10 deletions(-)

--

I tried to fix all bugs in Python related to handling memory
allocation failures correctly. When the memory limit is very low
(less than 1024 bytes), there are still some corner cases where
Python may crash. These corner cases can be fixed. For example, I
proposed a patch to make PyErr_NoMemory() more reliable (probably the
most critical of the remaining bugs):
http://bugs.python.org/issue19835

I fixed a reference leak in the unittest module (#19880) to reduce the
risk of a PyErr_NoMemory() crash when using the Python test suite to
test the memory limit feature (see issue #19817). I'm not sure that
the PyErr_NoMemory() bug affects real applications.

Victor
