[Numpy-discussion] At.: use less RAM memory and increase execution speed
msarahan at gmail.com
Thu Sep 26 14:40:39 EDT 2013
In Python 2, xrange() should be more memory efficient than range(), since it yields indices lazily instead of building a full list up front.
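A minimal sketch of the difference, using Python 3's range() (which behaves like Python 2's xrange()) next to a materialized list:

```python
import sys

# Python 3's range() is lazy, like Python 2's xrange(): its memory
# footprint is constant no matter how long the sequence is.
n = 10 ** 6
lazy = range(n)              # small, constant-size object
eager = list(range(n))       # materializes all n integers at once

print(sys.getsizeof(lazy))   # a few dozen bytes
print(sys.getsizeof(eager))  # several megabytes
```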
Replacing arrays with lists is probably a bad idea for a lot of reasons.
You'll lose nice vectorization of simple operations, and all of numpy's
other benefits. To be more parsimonious with memory, you probably want to
pay close attention to array data types and make sure that things aren't
being automatically upconverted to higher precision data types.
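A small sketch of both effects: choosing a narrower dtype halves the memory, and mixing dtypes silently promotes the result to the wider type:

```python
import numpy as np

# Same number of elements, half the memory with float32:
a32 = np.ones(1_000_000, dtype=np.float32)
a64 = np.ones(1_000_000, dtype=np.float64)
print(a32.nbytes, a64.nbytes)  # float64 uses twice the bytes

# Mixing dtypes silently upconverts to the higher-precision type:
mixed = a32 + a64
print(mixed.dtype)             # float64
```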
I've never considered using reset(). A better approach is to take a look at your
program's structure and make sure that arrays you no longer need can be garbage
collected, for example by letting them go out of scope or deleting references with del.
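A sketch of that idea: once the last reference to an array is dropped, CPython frees it immediately, which we can observe with a weak reference:

```python
import gc
import weakref

import numpy as np

a = np.zeros(10 ** 6)    # ~8 MB of float64
ref = weakref.ref(a)     # weak reference does not keep the array alive

del a                    # drop the last strong reference
gc.collect()             # usually redundant in CPython, but explicit

print(ref() is None)     # True: the array's memory has been released
```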
Preallocation of arrays can give you big wins in both memory use and program
speed. If you aren't using preallocation, now's a great time to start.
You can pass numpy arrays into Cython functions, and you can also call
numpy/scipy functions within Cython functions. Identify your bottlenecks
using some kind of profiling, then work on optimizing those with Cython.
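Before reaching for Cython, the profiling step above can be done with the standard library's cProfile; a sketch with a hypothetical pure-Python loop standing in for a bottleneck:

```python
import cProfile
import io
import pstats

import numpy as np

def slow_python_sum(a):
    """Pure-Python loop: the kind of hotspot worth moving to numpy or Cython."""
    total = 0.0
    for x in a:
        total += x
    return total

a = np.random.rand(100_000)

prof = cProfile.Profile()
prof.enable()
result = slow_python_sum(a)
prof.disable()

buf = io.StringIO()
pstats.Stats(prof, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())   # slow_python_sum should dominate the report
```

Once the profile confirms where the time goes, that function is the one to rewrite with typed Cython code (or replace with a vectorized numpy call like a.sum()).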
On Thu, Sep 26, 2013 at 11:19 AM, Josè Luis Mietta <
joseluismietta at yahoo.com.ar> wrote:
> Hi experts!
> I want to use less RAM in my Monte Carlo simulations. In my
> algorithm I use numpy arrays and the xrange() function.
> I hear that I can reduce the RAM used by my algorithm if I do the following:
> 1) replace xrange() with range().
> 2) replace numpy arrays with Python lists.
> 3) use the reset() function to delete useless arrays.
> Is that true?
> In addition, I want to increase the execution speed of my code (I use numpy and
> SciPy functions). How can I apply Cython? Will it help?
> Please help.
> Thanks a lot!!
> NumPy-Discussion mailing list
> NumPy-Discussion at scipy.org