Large data arrays?
aahz at pythoncraft.com
Thu Apr 23 21:48:12 CEST 2009
In article <ytziqkvacul.fsf at burgos.aip.de>,
Ole Streicher <ole-usenet-10 at gmx.net> wrote:
>for my application, I need to use quite large data arrays
>(100,000 x 4000 values) with floating point numbers where I need a fast
>row-wise and column-wise access (main case: return a column with the sum
>over a number of selected rows, and vice versa).
>I would use the numpy array for that, but they seem to be
>memory-resident. So, one of these arrays would use about 1.6 GB of
>memory, which is far too much. So I was thinking about a memory-mapped
>file for that. As far as I understand, there is one in numpy.
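For reference, the memory-mapped array the question alludes to is `numpy.memmap`. A minimal sketch of the use case described above (column sums over a selection of rows, with the data kept on disk rather than in RAM) — the file path and the scaled-down shape here are illustrative, not from the original post:

```python
import os
import tempfile
import numpy as np

# Scaled-down shape for the demo; the real case was 100,000 x 4000.
rows, cols = 1000, 400
path = os.path.join(tempfile.mkdtemp(), "data.dat")

# Create a disk-backed float32 array (100,000 x 4000 in float32
# is the ~1.6 GB mentioned in the post) and fill it.
arr = np.memmap(path, dtype=np.float32, mode="w+", shape=(rows, cols))
arr[:] = 1.0
arr.flush()

# Re-open read-only; only the pages actually touched are paged in.
m = np.memmap(path, dtype=np.float32, mode="r", shape=(rows, cols))

# Main case from the post: per-column sums over selected rows.
selected = [0, 10, 42]
col_sums = m[selected, :].sum(axis=0)
print(col_sums[0])  # 3.0 (three selected rows of 1.0)
```

Note that fancy indexing (`m[selected, :]`) copies the selected rows into an ordinary in-memory array, so only the selection — not the full 1.6 GB — needs to fit in RAM at once.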
You probably want to ask on a NumPy or SciPy list.
Aahz (aahz at pythoncraft.com) <*> http://www.pythoncraft.com/
"If you think it's expensive to hire a professional to do the job, wait
until you hire an amateur." --Red Adair