More efficient array processing
John [H2O]
washakie at gmail.com
Thu Oct 23 14:11:32 EDT 2008
Hello,
I'm trying to do the following:
datagrid = numpy.zeros((360, 180, 3, 73, 20))
But I get an error saying the array is too large. Is there a memory issue
here?
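For what it's worth, a quick back-of-the-envelope check suggests it is indeed memory: `numpy.zeros` defaults to float64, so the requested array needs about 2.1 GiB. A minimal sketch of the arithmetic:

```python
# Rough size estimate for the requested array, assuming the numpy.zeros
# default dtype of float64 (8 bytes per element).
shape = (360, 180, 3, 73, 20)

n_elements = 1
for dim in shape:
    n_elements *= dim          # 283,824,000 elements in total

bytes_needed = n_elements * 8  # 2,270,592,000 bytes, i.e. ~2.1 GiB
print(n_elements, bytes_needed / 2.0**30)
```

On a 32-bit build (or a machine without that much free RAM), an allocation of that size will fail.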
So, my workaround is this:
numpoint = 73
datagrid = numpy.zeros((360, 180, 3, numpoint, 1))
for np in range(numpoint):
    datagrid[:, :, :, np, 0] += concgrid[:, :, :, np, 0]
But this is SLOW... What can I do to increase efficiency here? Is there a way
to create the larger array? The program actually loops through several days,
filling the 5th dimension. Eventually I just sum over the 5th dimension anyway
(as done in the loop of the workaround).
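Since the loop adds the same slice of `concgrid` into the same slice of `datagrid` for every `np`, one possibility is to drop the Python loop and do a single whole-array addition per day. A minimal sketch, assuming `concgrid` holds one day's data with the shapes used above (the `numpy.ones` fill is just a stand-in for the real input):

```python
import numpy

numpoint = 73
datagrid = numpy.zeros((360, 180, 3, numpoint, 1))

# Stand-in for one day's data; in the real program this would be read
# from the day's input files.
concgrid = numpy.ones((360, 180, 3, numpoint, 1))

# One vectorized addition replaces the explicit loop over numpoint:
# numpy applies the += element-wise across all 73 points at once.
datagrid[:, :, :, :, 0] += concgrid[:, :, :, :, 0]
```

Repeating that one line per day accumulates the running sum in C speed rather than through 73 Python-level slice assignments per day.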
Thanks!
john