[Numpy-discussion] Is there a pure numpy recipe for this?

Slaunger Slaunger at gmail.com
Thu Mar 27 03:02:38 EDT 2014


Chris Barker - NOAA Federal wrote
> note that  numpy arrays are not re-sizable, so np.append() and np.insert()
> have to make a new array, and copy all the old data over. If you are
> appending one at a time, this can be pretty darn slow.
> 
> I wrote a "grow_array" class once, it was a wrapper around a numpy array
> that pre-allocated extra data to make appending more efficient. It's kind
> of half-baked code now, but let me know if you are interested.

Hi Chris,

Yes, that is a good point and I am aware of it. For some of these functions it
would have been nice if I could have passed a preallocated, properly sliced
array to the functions, which I could then reuse in each iteration step.
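
Just to illustrate the kind of reuse I have in mind, here is a minimal
sketch using the out= argument that numpy ufuncs accept (the arrays and
the ufunc are only placeholders, not my actual code):

    import numpy as np

    n = 1000
    a = np.random.rand(n)
    b = np.random.rand(n)
    buf = np.empty(n)          # preallocate once, outside the loop

    for _ in range(100):
        # write the result into the preallocated buffer instead of
        # allocating a fresh array in every iteration
        np.multiply(a, b, out=buf)
        # ... use buf in this iteration step ...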

It is indeed the memory allocation which appears to take more time than the
actual calculations.

Still, it is much faster to create a few arrays than to loop through a
thousand individual elements in pure Python.
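
For example (only to illustrate the kind of difference, not a real
benchmark from my code):

    import numpy as np

    x = np.random.rand(1000)

    # pure Python: loops over the individual elements one by one
    total_py = sum(xi * xi for xi in x)

    # numpy: allocates one temporary array, but the arithmetic runs in C
    total_np = (x * x).sum()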

The grow_array class sounds interesting. I think what I have for now is
sufficient, but I will keep your offer in mind :)
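
For anyone else reading along, the idea as I understand it is roughly the
following (my own rough sketch, not Chris' actual code):

    import numpy as np

    class GrowArray:
        """Append-friendly wrapper that over-allocates and doubles its
        capacity when full, so appends are amortized O(1)."""

        def __init__(self, capacity=16, dtype=float):
            self._data = np.empty(capacity, dtype=dtype)
            self._size = 0

        def append(self, value):
            if self._size == len(self._data):
                # double the underlying buffer and copy the old data over
                new_data = np.empty(2 * len(self._data),
                                    dtype=self._data.dtype)
                new_data[:self._size] = self._data[:self._size]
                self._data = new_data
            self._data[self._size] = value
            self._size += 1

        @property
        def values(self):
            # view of the filled part only
            return self._data[:self._size]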

--Slaunger





