[Numpy-discussion] Unhelpful errors trying to create very large arrays?
Matthew Brett
matthew.brett at gmail.com
Sun Mar 22 02:08:06 EDT 2009
Hi,
>> I found this a little confusing:
>>
>> In [11]: n = 2500000000
>>
>> In [12]: np.arange(n).shape
>> Out[12]: (0,)
>>
>> Maybe this should raise an error instead.
>>
>> This was a little more obvious, but perhaps again a more explicit
>> error would be helpful?
>>
>> In [13]: np.zeros((n,))
>>
>> ---------------------------------------------------------------------------
>> OverflowError                             Traceback (most recent call last)
>>
>> /home/mb312/tmp/max_speed.py in <module>()
>> ----> 1 np.zeros((n,))
>>
>> OverflowError: long int too large to convert to int
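A plausible explanation for the empty-array result (a sketch, not verified against the NumPy source): on a 32-bit build the requested length is cast to a signed 32-bit C integer, so it wraps around to a negative count, and arange with a negative length quietly returns shape (0,):

```python
# Two's-complement wrap of the requested length into 32 bits
# (assumption: this mirrors what the C cast does on a 32-bit build).
n = 2500000000
wrapped = ((n + 2**31) % 2**32) - 2**31
print(wrapped)  # -1794967296: a negative count, hence shape (0,)
```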
>
> Open a ticket. For testing purposes, such large integers are easier to parse
> if they are written as products, i.e., something like n = 25*10**8. That is
> about 10 GB for an integer array. How much memory does your machine have?
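The "about 10 GB" figure can be checked by hand; this sketch assumes a 4-byte default integer (as on a 32-bit build — on 64-bit builds the default is typically 8 bytes, doubling the estimate):

```python
# Rough memory estimate for an integer array of 25*10**8 elements,
# assuming (hypothetically) a 4-byte default int dtype.
n = 25 * 10**8
itemsize = 4  # bytes per element on a 32-bit build
gib = n * itemsize / 2**30
print(round(gib, 2))  # ~9.31 GiB, i.e. roughly the "10 GB" quoted
```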
The machine has 2GB of RAM.
I notice these commands give much more helpful MemoryErrors on a
64-bit machine with 4GB of memory.
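Until the ticket is resolved, one workaround is to fail early with a clear message rather than hit the opaque OverflowError. This is a hypothetical helper (the name, the 2 GiB cap, and the size check are all assumptions for illustration, not NumPy behavior):

```python
import numpy as np

def safe_zeros(shape, dtype=np.float64):
    """Hypothetical helper: refuse clearly-too-large requests up front
    instead of letting np.zeros raise an opaque error."""
    nbytes = int(np.prod(shape, dtype=np.uint64)) * np.dtype(dtype).itemsize
    limit = 2**31  # example cap (assumption): refuse anything over 2 GiB
    if nbytes > limit:
        raise MemoryError(f"refusing to allocate {nbytes} bytes (> {limit})")
    return np.zeros(shape, dtype=dtype)

small = safe_zeros((10,))
print(small.shape)  # (10,)
```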
I will open a ticket,
Thanks,
Matthew
More information about the NumPy-Discussion mailing list