[Numpy-discussion] numpy slices limited to 32 bit values?

Charles R Harris charlesr.harris at gmail.com
Thu May 14 02:04:31 EDT 2009


On Wed, May 13, 2009 at 10:50 PM, Glenn Tarbox, PhD <glenn at tarbox.org> wrote:

> I'm using the latest version of Sage (3.4.2) which is python 2.5 and numpy
> something or other (I will do more digging presently)
>
> I'm able to map large files and access all the elements unless I'm using
> slices
>
> so, for example:
>
> fp = np.memmap("/mnt/hdd/data/mmap/numpy1e10.mmap", dtype='float64',
> mode='r+', shape=(10000000000,))
>
> which is 1e10 doubles if you don't wanna count the zeros
>
> gives full access to a 75 GB memory image
>
> But when I do:
>
> fp[:] = 1.0
> np.sum(fp)
>
> I get 1410065408.0 as the result
>

As doubles, that result corresponds to more than 2**33 bytes, so I expect
something else is going on. How much physical/swap memory do you have? This
could also be a Python problem, since Python does the memmap.
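[Editorial note: the arithmetic below is not part of the original exchange; it is a quick sanity check on the numbers quoted in the thread. It confirms the "75 GB" figure and shows that the reported sum equals 10**10 modulo 2**32, which would be consistent with a 32-bit wraparound somewhere in the indexing or accumulation path, though the thread itself does not establish the cause.]

```python
# Sanity checks on the numbers in the thread. Nothing here touches a
# memmap; it is plain integer arithmetic only.

n = 10_000_000_000           # number of float64 elements in the memmap
bytes_total = n * 8          # a float64 occupies 8 bytes

# 1e10 doubles is about 74.5 GiB, matching the "75 GB memory image" above.
print(bytes_total / 2**30)   # ~74.5

# Summing an all-ones array of 1e10 elements should give 1e10, but the
# reported value is exactly 1e10 reduced modulo 2**32:
print(n % 2**32)             # 1410065408 -- the value reported above
```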

Chuck

