[Numpy-discussion] A memory problem: why does mmap come up in numpy.inner?

Charles R Harris charlesr.harris at gmail.com
Wed Jun 4 22:36:08 EDT 2008


On Wed, Jun 4, 2008 at 7:41 PM, Dan Yamins <dyamins at gmail.com> wrote:

>
>
> On Wed, Jun 4, 2008 at 9:06 PM, Charles R Harris <
> charlesr.harris at gmail.com> wrote:
>
>>
>>
>> On Wed, Jun 4, 2008 at 6:42 PM, Dan Yamins <dyamins at gmail.com> wrote:
>>
>>> I'm using Python 2.5.2 on OS X, with 8 GB of RAM and a 64-bit
>>> processor.  In this setting, I'm working with large arrays of binary
>>> data.  E.g., I want to make calls like:
>>>                Z = numpy.inner(a,b)
>>> where a and b are fairly large -- e.g. 20000 rows by 100 columns.
>>>
>>> However, when such a call is made, I get a memory error that I don't
>>> understand.  Specifically:
>>>
>>> >>> s = numpy.random.binomial(1,.5,(20000,100))   # creates a 20000x100 binary array
>>> >>> r = numpy.inner(s,s)
>>> Python(1714) malloc: *** mmap(size=1600000000) failed (error code=12)
>>> *** error: can't allocate region
>>> *** set a breakpoint in malloc_error_break to debug
>>>
>>>
>>>
>> Are both Python and your version of OS X fully 64-bit?
>>
>
>
> I'm not sure.  My version of OS X is the most recent one, the version that
> ships with the new Mac Pro dual quad-core 3.2 GHz Xeon machines.  The
> processor is definitely 64-bit, so I think the operating system is probably
> enabled for it, but I'm not sure.  (How would I find out?)  As for the
> Python version, I thought that 2.5 and above were 64-bit enabled, but I'm
> not sure how I'd check it.
>
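
For what it's worth, the size in that mmap error matches the result array:
inner(s, s) on a 20000x100 array has to allocate a contiguous 20000x20000
result.  A rough sketch of the arithmetic (assuming the result keeps the
integer dtype of s; the 4-byte case would itself point at a 32-bit build):

import numpy

s = numpy.random.binomial(1, .5, (20000, 100))
rows = s.shape[0]
# inner(s, s) produces a rows x rows array with the same dtype as s.
print(rows * rows * s.dtype.itemsize)
# 1600000000 with a 4-byte integer dtype (32-bit build),
# 3200000000 with an 8-byte integer dtype (64-bit build).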

Try

In [3]: numpy.dtype(numpy.uintp).itemsize
Out[3]: 4

which is the size in bytes of the integer needed to hold a pointer. The
output above is for a 32-bit Python/NumPy.
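
Another quick check (just a sketch, assuming a Python 2 interpreter as in
this thread) is to ask the interpreter itself for its pointer size; it
should agree with the numpy.uintp value above:

import struct
import sys

# Size of a C pointer in the running interpreter: 4 means 32-bit, 8 means 64-bit.
print(struct.calcsize("P"))

# On Python 2, sys.maxint is 2**31 - 1 on a 32-bit build, 2**63 - 1 on 64-bit.
print(sys.maxint)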

Chuck