memory usage

Robert Toop toop82 at comcast.net
Sat Dec 6 22:18:29 EST 2003


"Dave" <jaguar4096 at yahoo.com> wrote in message
news:c375aec2.0312052020.6e26239c at posting.google.com...
> Hi,
> I'm writing code which deals with lists of millions to tens of millions
> of elements.  It seems to me that python gobbles up memory faster than
> it should in these cases.
>
> For example, I start the python interpreter and then execute the
> following code:
>
> >>> f = []
> >>> for i in range(1e7):
> ...    f.append(0.1)
>
> In another window on my Linux system I then run top.  The python
> process is now using 155 MB of memory.  Why is python using such a
> large amount of memory, nearly 16 bytes per float in this
> case?

8 bytes for the raw double is only part of the story. A Python float is
a boxed object: on a typical 32-bit build it carries a 4-byte reference
count and a 4-byte type pointer alongside the 8-byte value, 16 bytes in
all, and the list adds a 4-byte pointer per element on top of that.
The list itself is not a linked list, by the way; it's a contiguous,
over-allocated array of pointers. The spare slots from over-allocation
are what let append() run in amortized constant time without copying
every element on each call.
In your particular loop there's a further wrinkle: the literal 0.1 is a
single shared constant object, so f ends up holding ten million pointers
to the same float. Most of the 155 MB actually comes from range(1e7),
which materializes a second ten-million-element list of int objects just
to drive the loop.
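Rough arithmetic (a back-of-the-envelope sketch, assuming typical 32-bit
CPython 2.3 object sizes; exact figures vary by platform):

  10,000,000 pointers in f             *  4 bytes  ~=  40 MB
  10,000,000 int objects from range()  * ~12 bytes ~= 120 MB
  the single shared float 0.1                          negligible
                                                    ------------
                                                    ~= 160 MB

That lines up with the 155 MB top reports. The int objects are released
when the loop ends, but they go onto Python's internal free list rather
than back to the OS, so the process size stays high.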

Boxing every value like this costs not only extra memory but extra time,
to say nothing of needing enough RAM to avoid virtual-memory paging.
You'd be better off with a contiguous array of raw doubles; the standard
library's array module provides exactly that. Of course, that might be
impractical for your app if the elements aren't all the same type.
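For example (a minimal sketch; xrange avoids materializing the index
list the way range does):

  >>> from array import array
  >>> f = array('d')              # 'd' = C double, 8 bytes per element
  >>> for i in xrange(10000000):
  ...     f.append(0.1)

That stores the ten million values in roughly 80 MB, around half of what
the list of boxed floats plus range() needed.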

If your lists require dynamic extension and sparse random access,
you'd be better off with an indexed-tree (database-like) storage scheme.
I'm fairly new to Python, but such packages ship in the standard
library: bsddb wraps Berkeley DB B-tree files, and shelve provides a
persistent, dictionary-like store on top of the dbm family.
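A minimal sketch using bsddb (the file name here is just an example;
B-tree keys and values must be strings, so struct packs the numbers, and
big-endian keys keep the tree's ordering numeric):

  >>> import bsddb, struct
  >>> db = bsddb.btopen('/tmp/floats.db', 'c')
  >>> db[struct.pack('>l', 12345)] = struct.pack('d', 0.1)
  >>> struct.unpack('d', db[struct.pack('>l', 12345)])[0]
  0.10000000000000001
  >>> db.close()

Only the keys you actually store take space, so a sparse structure with
ten million possible slots costs nothing for the empty ones.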
