[Numpy-discussion] [Fwd: Large array using 4 times as much memory as it should]

Rick Giuly rgiuly at gmail.com
Thu Oct 30 21:48:01 EDT 2008


Please disregard the last message.

I just realised that 8*4=32: a uint32 is 32 bits, i.e. 4 bytes per element,
so the arrays below really do need four times as many bytes as they have
elements. Very sorry about this. Apparently I need some sleep.
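
For anyone who hits the same confusion: an array of N uint32 elements
occupies 4*N bytes. A minimal check using numpy's itemsize/size/nbytes
attributes (this snippet is illustrative and was not part of my tests):

import numpy
a = numpy.ones((1024, 1024, 50), dtype=numpy.uint32)
print(a.dtype.itemsize)  # 4 bytes per element (8 bits * 4 = 32 bits)
print(a.size)            # 52428800 elements, the "50M"
print(a.nbytes)          # 209715200 bytes, about 200M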

-Rick

-------- Original Message --------
Subject: Large array using 4 times as much memory as it should
Date: Thu, 30 Oct 2008 18:41:44 -0700
From: Rick Giuly <rgiuly at gmail.com>
To: numpy-discussion at scipy.org

Hello All,

I find that Python is using about four times as much memory as it should
need for arrays. This is a problem, as I need all available memory for
large 3D imaging datasets. Is there a way to get around this? Am I making
a mistake? Is it a bug?

(I'm running Windows XP 32-bit with 760M of memory "Available" according
to the "Performance" tab of the Task Manager.)

Versions: NumPy 1.2.0 with Python 2.5.2

Any help is appreciated


-Rick




**************************
Details of my testing:

Each test was run from the command line, and Python was restarted before
each test.

Testing a 50M-element array:
a = numpy.ones((1024,1024,50), dtype=numpy.uint32)
The available memory dropped by 200M
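
That figure is exactly what the array should need: 1024*1024*50 elements
at 4 bytes per uint32 element. A quick sanity check of the arithmetic
(illustrative only):

print(1024 * 1024 * 50 * 4)           # 209715200 bytes
print(1024 * 1024 * 50 * 4 // 2**20)  # 200 MB, matching the observed drop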


Testing a 100M-element array:
a = numpy.ones((1024,1024,100), dtype=numpy.uint32)
The available memory dropped by 400M


Testing a 200M-element array:
a = numpy.ones((1024,1024,200), dtype=numpy.uint32)
The available memory dropped by 750M


Testing a 300M-element array:
a = numpy.ones((1024,1024,300), dtype=numpy.uint32)
An error occurs:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "o:\software\pythonxy\python\lib\site-packages\numpy\core\numeric.py", line 1445, in ones
    a = empty(shape, dtype, order)
MemoryError
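
The failure is consistent with the arithmetic: at 4 bytes per element, the
300-slice array would need about 1200M, well beyond the ~760M reported as
available. A sketch of the expected footprints (the observed drops above
roughly track these figures; on 32-bit Windows a large allocation can also
fail earlier due to address-space fragmentation, so the exact threshold is
approximate):

for depth in (50, 100, 200, 300):
    # 1024*1024 uint32 elements per slice, 4 bytes each
    print(1024 * 1024 * depth * 4 // 2**20)  # 200, 400, 800, 1200 MB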