[Numpy-discussion] speeding up an array operation

Mag Gam magawake at gmail.com
Thu Jul 9 01:12:16 EDT 2009


Hey All

I am reading through a file and trying to store the values into another
array, but instead of storing the values one by one, I would like to store
them in bulk for performance.

Here is what I have, which loads one row at a time:

import csv

z = {}  # per-key row counter
r = csv.reader(file)
for i, row in enumerate(r):
    p = "/MIT/" + row[1]

    if p not in z:
        z[p] = 0
    else:
        z[p] += 1

    arr[p]['chem'][z[p]] = tuple(row)  # this loads the array 1 row at a time


I would like to avoid the row-by-row loading; instead I would like to bulk
load the array. Let's say load up 5 million lines into memory and then
push them into the array. Any ideas on how to do that?
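One common approach (a sketch only, since the structure of `arr` isn't shown in the post) is to do the accumulation in plain Python lists first, grouped by key, and only convert to NumPy arrays at the end. Appending to a list is cheap, whereas writing into an array element by element, or growing an array, is not. The sample data and key names below are hypothetical stand-ins for the real CSV file:

```python
import csv
import io
from collections import defaultdict

import numpy as np

# Hypothetical sample data standing in for the real file object.
data = io.StringIO("a,x,1\nb,x,2\nc,y,3\n")

# First pass: accumulate all rows for each key in plain Python lists.
groups = defaultdict(list)
for row in csv.reader(data):
    p = "/MIT/" + row[1]
    groups[p].append(tuple(row))

# Second pass: convert each group to a NumPy array in one shot.
arrays = {p: np.array(rows) for p, rows in groups.items()}
```

If the whole file fits in memory and every column is numeric, `np.loadtxt` or `np.genfromtxt` with `delimiter=","` can also pull the entire file into one array in a single call, which you could then slice or sort by key.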

TIA


