[Numpy-discussion] Crash on failed memory allocation

Albert Strasheim fullung at gmail.com
Tue May 2 16:28:01 EDT 2006


Hello all

Stefan van der Walt and I have discovered two bugs when working with large
blocks of memory and array descriptors.

Example code that causes problems:

import numpy as N
print N.__version__
x=[]
i = 20000
j = 10000
names = ['a', 'b']
formats = ['f8', 'f8']
descr = N.dtype({'names' : names, 'formats' : formats})
for y in range(i):
    x.append(N.empty((j,), dtype=descr)['a'])
N.asarray(x)

With i and j this large and a big descriptor, you run out of process address
space (typically 2 GB?) during the list append, which raises a MemoryError.
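The numbers above are enough to see why: each record in the descriptor holds two
f8 fields, so i * j records overflow a 2 GB address space. A rough back-of-the-envelope
check (the variable names here are just for illustration):

```python
import numpy as N

# same two-field descriptor as in the report
descr = N.dtype({'names': ['a', 'b'], 'formats': ['f8', 'f8']})
print(descr.itemsize)   # 16 bytes per record (two float64 fields)

i = 20000   # number of arrays appended to the list
j = 10000   # records per array
total_bytes = i * j * descr.itemsize
print(total_bytes)      # 3200000000 bytes, i.e. about 3 GiB
```

So the full loop needs roughly 3 GiB of backing storage, comfortably past a
2 GB per-process limit.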

However, with a slightly smaller list, you run out of memory inside asarray
instead. This causes a segfault on Linux. The same problem can also manifest
itself as a TypeError. On Windows with r2462 I got this message:

Traceback (most recent call last):
  File "numpybug.py", line 7, in ?
    N.asarray(x)
  File "C:\Python24\Lib\site-packages\numpy\core\numeric.py", line 116, in
asarray
    return array(a, dtype, copy=False, order=order)
TypeError: a float is required

Stefan also discovered the following bug:

import numpy as N
descr = N.dtype({'names' : ['a'], 'formats' : ['foo']}, align=1)

Notice the invalid format string. Combined with align=1, this seems to be
guaranteed to crash.
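The expected behaviour here would presumably be an exception rather than a
crash. A minimal sketch of that check, assuming a NumPy build in which the bug
is fixed (invalid format strings are rejected with a TypeError):

```python
import numpy as N

# 'foo' is not a valid type string; a fixed NumPy should reject it
# with a TypeError instead of crashing, with or without align=1.
try:
    N.dtype({'names': ['a'], 'formats': ['foo']}, align=1)
    print('no error raised')
except TypeError as e:
    print('TypeError:', e)
```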

Regards,

Albert
