[issue14596] struct.unpack memory leak

Robert Elsner report at bugs.python.org
Mon Apr 16 16:21:31 CEST 2012


Robert Elsner <robert.elsner2 at googlemail.com> added the comment:

Well, I stumbled across this leak while reading big files. And what is
the point of having a fast C-level unpack when it cannot be used with
big files?
I am not averse to the idea of caching the format string, but if the
cache grows beyond a reasonable size, it should be freed. And
"reasonable" is not the number of objects contained but the amount of
memory they consume. Caching an arbitrary amount of data (8 GB here)
is a waste of memory.
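
To make the behavior concrete, here is a minimal sketch. Note that
struct._clearcache() is an undocumented helper (it exists mainly for
the test suite), so the explicit cleanup at the end is an
implementation detail, not a supported API:

    import struct

    # A huge repeat count compiles a correspondingly huge Struct
    # object, and struct.unpack() caches it behind the caller's back.
    fmt = '1000000d'                      # one million doubles
    data = b'\x00' * struct.calcsize(fmt)
    values = struct.unpack(fmt, data)
    del values, data                      # our references are gone...

    # ...but the compiled format is still held in the module-level
    # cache, and only this undocumented helper releases it.
    struct._clearcache()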

And according to the Python docs, struct.Struct.unpack, which is _not_
affected by the memory leak, is supposed to be faster. Quote:

> class struct.Struct(format)
> 
> Return a new Struct object which writes and reads binary data
> according to the format string format. Creating a Struct object
> once and calling its methods is more efficient than calling the
> struct functions with the same format since the format string only
> needs to be compiled once.
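
For illustration, the compile-once pattern the docs describe looks
like this (the record layout here is made up):

    import struct

    rec = struct.Struct('<idd')       # hypothetical record: int32 + two doubles
    packed = rec.pack(7, 1.5, 2.5)    # format string compiled exactly once
    ident, x, y = rec.unpack(packed)  # reuse the compiled format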

Caching in the case of struct.Struct is straightforward: as long as the
object exists, the format string is cached, and once the object is no
longer reachable, its memory is freed - including the cached format
string. The problem is the "magic" creation of struct.Struct objects by
struct.unpack, which linger in the module-level cache even after all
associated variables have gone out of scope.

Using, for example, a fixed 1 MB buffer to read files (regardless of
size) incurs a huge performance penalty. Reading everything at once
into memory using struct.unpack (or, at the same speed,
struct.Struct.unpack) is the fastest way: approximately 40% faster than
array.fromfile and 70% faster than numpy.fromfile.
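
For reference, the whole-file pattern I mean looks roughly like the
sketch below. It writes a tiny placeholder file ('big.bin') so it runs
on its own; in practice that would be a multi-gigabyte file of
little-endian doubles. Because the compiled format lives in an explicit
Struct object, it also doubles as the workaround mentioned further
down:

    import os
    import struct

    # Placeholder data so the sketch is self-contained.
    with open('big.bin', 'wb') as f:
        f.write(struct.pack('<4d', 1.0, 2.0, 3.0, 4.0))

    n = os.path.getsize('big.bin') // struct.calcsize('<d')
    s = struct.Struct('<%dd' % n)     # compiled once, owned by us
    with open('big.bin', 'rb') as f:
        values = s.unpack(f.read(s.size))
    del s  # the compiled format dies with the object,
           # unlike struct.unpack's hidden cache entry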

I read some unspecified report about a possible memory leak in
struct.unpack, but the author did not investigate further. It took me
quite some time to figure out what exactly happens. So there should be
at least a warning about this (ugly) behavior when reading big files
for speed, and a pointer to a quick workaround (using
struct.Struct.unpack).

cheers

Am 16.04.2012 15:59, schrieb Antoine Pitrou:
> 
> Antoine Pitrou <pitrou at free.fr> added the comment:
> 
>> Perhaps the best quick fix would be to only cache small
>> PyStructObjects, for some value of 'small'.  (Total size < a few
>> hundred bytes, perhaps.)
> 
> Or perhaps not care at all? Is there a use case for huge repeat counts?
> (limiting cacheability could decrease performance in existing
> applications)
