[SciPy-user] Arrays and strange memory usage ...

David Cournapeau david at ar.media.kyoto-u.ac.jp
Tue Sep 2 21:19:22 EDT 2008


Robert Kern wrote:
>
> No, the default is int (int32 on 32-bit systems, int64 on most 64-bit
> systems) if you give it integer arguments and float64 if you give it
> float arguments.
>   

Ah, my bad, I should have thought about the difference between 1e6 and
1000000.
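
For reference, a quick check makes the difference visible (a sketch; the
integer result is int32 or int64 depending on the platform):

    >>> import numpy as np
    >>> np.arange(1000000).dtype   # integer argument -> platform default int
    dtype('int64')                 # int32 on 32-bit systems
    >>> np.arange(1e6).dtype       # float argument -> float64
    dtype('float64')
    >>> np.arange(1e6).nbytes      # 10**6 elements * 8 bytes each, ~8 MB
    8000000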

>   
>>> It gets even worse with complex float. I tried :
>>> z = arange(1000000) + 1j*arange(1000000)
>>>
>>> Expecting 8 Mb,
>>>       
>> Again, this is strange, it should default to float128. Which version
>> of numpy/scipy are you using ?
>>     
>
> You mean complex128.
>   

Yes; I just wanted to point out that 1j*arange(1000000) is expected to
take ~16 MB, not 8.
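
The 16 comes from the itemsize: multiplying by 1j upcasts to complex128,
which is 16 bytes per element:

    >>> import numpy as np
    >>> z = 1j * np.arange(1000000)
    >>> z.dtype
    dtype('complex128')
    >>> z.nbytes    # 10**6 elements * 16 bytes each
    16000000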

> One thing to be aware of is that there are temporaries involved.
> 1j*arange(1000000) will allocate almost 16 MB of memory just by itself
> and then allocate another 16 MB for the result of the addition. The
> memory may not get returned to the OS when an object gets deallocated
> although it will be reused by Python.
>   

I think that on Linux, for allocations of this size, the memory is given
back to the OS right away: glibc's malloc uses mmap above a size
threshold, and free() returns mmap'd blocks to the OS immediately. Since
I see the exact same behavior as you in top (b = np.arange(1e6) +
1.j*np.arange(1e6) adding only ~16 MB), maybe the Mac OS X malloc does
something similar.
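
If the temporaries themselves are the problem, one workaround (a sketch,
not something discussed in this thread) is to allocate the complex array
once and fill the real and imaginary parts in place:

    import numpy as np

    n = 1000000
    # One ~16 MB complex128 allocation for the result.
    z = np.empty(n, dtype=np.complex128)
    # Fill real and imaginary parts in place; each arange() still makes
    # an ~8 MB temporary, but this avoids the extra ~16 MB complex
    # temporary created by 1j*arange(n).
    z.real = np.arange(n)
    z.imag = np.arange(n)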

cheers,

David


