<div dir="ltr"><div>Another thing to point out about having an array of that percentage of the available memory is that it severely restricts what you can do with it. Since you are above 50% of the available memory, you won't be able to create another array that would be the result of computing something with that array. So, you are restricted to querying (which you could do without having everything in-memory), or in-place operations.</div><div><br></div><div>Dask arrays might be what you are really looking for.</div><div><br></div><div>Ben Root<br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Tue, Mar 24, 2020 at 2:18 PM Sebastian Berg <<a href="mailto:sebastian@sipsolutions.net">sebastian@sipsolutions.net</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">On Tue, 2020-03-24 at 13:59 -0400, Keyvis Damptey wrote:<br>
> Hi Numpy dev community,<br>
> <br>
> I'm keyvis, a statistical data scientist.<br>
> <br>
> I'm currently using numpy in python 3.8.2 64-bit for a clustering<br>
> problem, on a machine with 1.9 TB RAM. When I try using np.zeros to<br>
> create a 600,000 by 600,000 matrix of dtype=np.float32 it says<br>
> "Unable to allocate 1.31 TiB for an array with shape (600000, 600000)<br>
> and data type float32"<br>
> <br>
<br>
If this error happens, the memory allocation itself failed. This is<br>
essentially a single `malloc` call in C, so it is the operating system<br>
complaining, not Python/NumPy.<br>
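The requested size in the error message checks out with a quick back-of-the-envelope calculation:

```python
shape = (600_000, 600_000)
itemsize = 4  # bytes per np.float32 element

nbytes = shape[0] * shape[1] * itemsize  # 1.44e12 bytes
tib = nbytes / 2**40                     # convert to binary tebibytes
print(f"{tib:.2f} TiB")                  # 1.31 TiB, matching the error
```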
<br>
I am not quite sure, but maybe memory fragmentation plays a part, or<br>
the process is simply out of memory: 1.31 TiB (about 1.44 TB) is a<br>
significant portion of the total memory, after all.<br>
<br>
Not sure what to say, but I think you should probably look into other<br>
solutions, maybe using HDF5, zarr, or memory-mapping (although I am not<br>
sure the last actually helps). It will be tricky to work with arrays of<br>
a size that is close to the available total memory.<br>
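A minimal memory-mapping sketch, scaled down to a small shape (the file path here is illustrative): `np.memmap` backs the array with a file on disk, so pages are paged in and out on demand rather than all held in a single heap allocation.

```python
import os
import tempfile

import numpy as np

# Disk-backed array; writes go through the OS page cache instead of
# requiring one huge in-memory allocation.
path = os.path.join(tempfile.gettempdir(), "demo_memmap.dat")
a = np.memmap(path, dtype=np.float32, mode="w+", shape=(1000, 1000))
a[0, :] = 1.0   # touch only the pages you actually use
a.flush()       # persist dirty pages to disk
```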
<br>
Maybe someone who works more with such data here can give you tips on<br>
what projects can help you or what solutions to look into.<br>
<br>
- Sebastian<br>
<br>
<br>
<br>
> I used psutil to determine how much RAM Python thinks it has access<br>
> to, and it returned approximately 1.8 TB.<br>
> <br>
> Is there some way I can fix numpy to create these large arrays?<br>
> Thanks for your time and consideration<br>
> _______________________________________________<br>
> NumPy-Discussion mailing list<br>
> <a href="mailto:NumPy-Discussion@python.org" target="_blank">NumPy-Discussion@python.org</a><br>
> <a href="https://mail.python.org/mailman/listinfo/numpy-discussion" rel="noreferrer" target="_blank">https://mail.python.org/mailman/listinfo/numpy-discussion</a><br>
<br>
</blockquote></div>
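To illustrate Ben's point about in-place operations at a small scale: NumPy ufuncs accept an `out=` argument, which writes the result into an existing buffer instead of allocating a second full-size array (a minimal sketch).

```python
import numpy as np

a = np.ones((4, 4), dtype=np.float32)
buf_before = a.__array_interface__["data"][0]  # address of the data buffer

np.multiply(a, 2.0, out=a)  # in-place: no second (4, 4) result array

# Same buffer, updated values: nothing extra was allocated.
assert a.__array_interface__["data"][0] == buf_before
print(a[0, 0])  # 2.0
```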