[Numpy-discussion] Out-of-RAM FFTs

David Cournapeau david at ar.media.kyoto-u.ac.jp
Wed Apr 1 11:26:24 EDT 2009


Greg Novak wrote:
> 1) Numerical Recipes has an out-of-memory FFT algorithm, but looking
> through the numpy and scipy docs and modules, I didn't find a function
> that does the same thing.  Did I miss it? 

I don't think so.

>  Should I get to work typing
> it in?
>   

Maybe :)
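(If you do go down that road: one standard decomposition that maps naturally onto out-of-core storage is the four-step/Bailey FFT, where a length N1*N2 transform becomes column FFTs, a twiddle multiply, and row FFTs - each pass touches the data one row or column at a time, which is what you want for a disk-backed array. This is not the Numerical Recipes routine itself, just a minimal in-memory sketch of the idea:

```python
import numpy as np

def four_step_fft(x, N1, N2):
    """Four-step (Bailey) FFT of a length N1*N2 signal.

    Each step works row- or column-wise, so with a memory-mapped
    array each pass could stream over the disk-backed matrix
    instead of holding everything in RAM. Done in-memory here
    for clarity.
    """
    # View x as an N1 x N2 matrix, row-major: A[n1, n2] = x[n1*N2 + n2]
    A = np.asarray(x, dtype=complex).reshape(N1, N2)
    # Step 1: length-N1 FFTs down the columns
    B = np.fft.fft(A, axis=0)
    # Step 2: twiddle factors exp(-2*pi*i * k1*n2 / (N1*N2))
    k1 = np.arange(N1)[:, None]
    n2 = np.arange(N2)[None, :]
    B *= np.exp(-2j * np.pi * k1 * n2 / (N1 * N2))
    # Step 3: length-N2 FFTs along the rows
    C = np.fft.fft(B, axis=1)
    # Step 4: transposed readout: X[k1 + N1*k2] = C[k1, k2],
    # i.e. flatten in column-major (Fortran) order
    return C.flatten(order='F')

x = np.random.rand(8 * 4)
print(np.allclose(four_step_fft(x, 8, 4), np.fft.fft(x)))  # True
```

With np.memmap in place of the in-memory array, steps 1-3 can each be done in chunks that fit in RAM.)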

> 2) I had high hopes for just memory-mapping the large array and
> passing it to the standard fft function.  However, the memory-mapped
> region must fit into the address space, and I don't seem to be able to
> use more than 2 GB at a time.  So memory mapping doesn't seem to help
> me at all.
>
> This last issue leads to another series of things that puzzle me.  I
> have an iMac running OS X 10.5 with an Intel Core 2 duo processor and
> 4 GB of memory.  As far as I've learned, the processor is 64 bit, the
> operating system is 64 bit, so I should be able to happily memory-map
> my entire disk if I want.  However, Python seems to run out of steam
> when it's used 2 GB.  This is true of both 2.5 and 2.6.  What gives?
> Is this a Python issue?
>   

Yes - the official Python binaries for OS X are 32-bit only, which caps
your address space at 2-4 GB no matter how much RAM the machine has. I
don't know how advanced/usable the 64-bit build is, but I am afraid you
will have to use an unofficial build or build it yourself.
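(A quick way to check which you are running - the pointer size of the interpreter is what actually limits mmap, not the CPU or the OS:

```python
import struct

# Size of a C pointer in this interpreter, in bits.
# 32 means a 2-4 GB address space regardless of hardware.
bits = struct.calcsize("P") * 8
print("%d-bit Python" % bits)
```

If that prints 32, no amount of memory-mapping tricks will get you past the 2 GB wall.)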

I don't know whether the following will help you:

http://developer.apple.com/documentation/Darwin/Conceptual/64bitPorting/intro/intro.html#//apple_ref/doc/uid/TP40001064-CH205-TPXREF101

cheers,

David
