[SciPy-user] Arrays and strange memory usage ...

christophe grimault christophe.grimault at novagrid.com
Tue Sep 2 13:11:22 EDT 2008


Hi,

I have an application that is very demanding in memory resources, so I
started to look more closely at Python + numpy/scipy as far as memory is
concerned.

I can't explain the following:

I start Python and import scipy. A 'top' in the console shows this:

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME COMMAND
14791 grimault  20   0 21624 8044 3200 S    0  0.4   0:00.43 python

Now, after typing:

z = scipy.arange(1000000)

I get:
14791 grimault  20   0 25532  11m 3204 S    0  0.6   0:00.44 python

So the memory increased by ~7 MB. I was expecting 4 MB, since the data
type is int32, giving 4 * 1000000 = 4 MB of memory (in C/C++ at
least).
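For comparison, the size of the array's own data buffer can be checked
directly (a quick sketch; I'm assuming nbytes and itemsize count only the
data buffer, not the interpreter or array-object overhead):

import numpy

z = numpy.arange(1000000)
print(z.dtype)      # default integer dtype on this machine
print(z.itemsize)   # bytes per element
print(z.nbytes)     # itemsize * number of elements = raw buffer size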

It gets even worse with complex floats. I tried:
z = arange(1000000) + 1j*arange(1000000)

I was expecting 8 MB, since z.dtype gives "complex64", but "top" shows an
increase of 31 MB.
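Again, checking the buffer of z itself rather than the whole process (a
sketch; the dtype printed is whatever the expression actually produces,
which is what drives the per-element cost):

import numpy

z = numpy.arange(1000000) + 1j*numpy.arange(1000000)
print(z.dtype)      # complex dtype of the result
print(z.itemsize)   # bytes per element for that dtype
print(z.nbytes)     # size of z's data buffer alone, in bytes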

This is very annoying. Can someone explain this? Is there a way to
create numpy arrays with the same (approximately! I know the array
class adds some overhead...) memory footprint as in C/C++?
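For what it's worth, this is the kind of construction I have been
experimenting with to keep the footprint close to the raw buffer size (a
sketch only; the explicit dtypes and the in-place fill of the real and
imaginary parts are my own assumptions about how to avoid the large
complex intermediates):

import numpy

n = 1000000

# Integer array with an explicit 4-byte dtype: buffer should be 4*n bytes.
a = numpy.arange(n, dtype=numpy.int32)
print(a.nbytes)     # 4000000

# Complex array allocated once, then filled in place. The float32 arange
# temporaries are still created, but no large complex intermediates are.
z = numpy.empty(n, dtype=numpy.complex64)    # 8 bytes per element
z.real = numpy.arange(n, dtype=numpy.float32)
z.imag = numpy.arange(n, dtype=numpy.float32)
print(z.nbytes)     # 8000000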

Thanks in advance
