Tuple size and memory allocation for embedded Python
Tim Peters
tim.peters at gmail.com
Fri Jan 21 17:51:21 EST 2005
[Jinming Xu]
>> Python seems unstable when allocating a lot of memory. For
>> example, the following C++ code creates a tuple of tuples:
>>
>> PyObject* arCoord = PyTuple_New(n);
>> double d = 1.5;
>> for(int i=0; i<n; i++)
>> {
>>     PyObject* coord = PyTuple_New(2);
>>     PyTuple_SetItem(coord, 0, PyFloat_FromDouble(d)); // x
>>     PyTuple_SetItem(coord, 1, PyFloat_FromDouble(d)); // y
>>     PyTuple_SetItem(arCoord, i, coord);
>> }
>>
>> When n is small, say 100, the code works fine. When n is
>> big, say > 10,000, Python has trouble allocating memory,
>> saying:
>>
>> "Exception exceptions.IndexError: 'tuple index out of range'
>> in 'garbage collection' ignored
>> Fatal Python error: unexpected exception during garbage
>> collection
>> Aborted"
[Craig Ringer]
> You're not checking for errors from PyTuple_SetItem.
Or from PyTuple_New(), or from PyFloat_FromDouble(). They can all
fail, and indeed:
> You need to do so, otherwise exceptions will go uncaught
> and may pop up at weird points later in your software's
> execution, or crash things.
That's right. There's no point thinking about this at all before
every C API call is checked for an error return.
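To make that concrete, here's a minimal sketch of the same loop with
every return checked. This is my illustration, not drop-in code for
the OP's app: the helper name build_coords is invented, and it assumes
the usual convention of reporting failure to the caller by returning
NULL.

    static PyObject *
    build_coords(int n)
    {
        PyObject *arCoord = PyTuple_New(n);
        if (arCoord == NULL)
            return NULL;

        double d = 1.5;
        for (int i = 0; i < n; i++) {
            PyObject *coord = PyTuple_New(2);
            if (coord == NULL)
                goto fail;

            PyObject *x = PyFloat_FromDouble(d);
            if (x == NULL) {
                Py_DECREF(coord);
                goto fail;
            }
            /* PyTuple_SetItem steals the reference to x, even on
               failure, so error paths release only the containers. */
            if (PyTuple_SetItem(coord, 0, x) < 0) {
                Py_DECREF(coord);
                goto fail;
            }

            PyObject *y = PyFloat_FromDouble(d);
            if (y == NULL) {
                Py_DECREF(coord);
                goto fail;
            }
            if (PyTuple_SetItem(coord, 1, y) < 0) {
                Py_DECREF(coord);
                goto fail;
            }

            /* Steals the reference to coord. */
            if (PyTuple_SetItem(arCoord, i, coord) < 0)
                goto fail;
        }
        return arCoord;

    fail:
        Py_DECREF(arCoord);
        return NULL;
    }

Tedious, yes, but that's life with the C API: if any of these calls
fails and you press on regardless, you're either dereferencing NULL or
leaving a stale exception set for some innocent later call to trip
over.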
BTW, since the error occurred during garbage collection, there's
really no reason to believe that the true cause of the problem is in
the code shown. It could simply be that allocating a whole lot of
tuples here triggers a round of garbage collection, which in turn
reveals an error in code we haven't been shown. In fact, I expect
that's the case. The errors the OP is ignoring here are
overwhelmingly likely to be failures of malloc() to find more memory;
in that case the C API calls shown would return NULL, and a segfault
would be very likely soon after.
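Here's a hypothetical illustration (my invention, not the OP's code)
of how a swallowed error can surface as exactly the fatal GC message
quoted above: a C call fails, nobody checks or clears the resulting
IndexError, and a later allocation triggers a collection while the
exception is still pending.

    /* Hypothetical sketch: leave an exception pending, then allocate
       enough to trigger a round of garbage collection. */
    PyObject *tup = PyTuple_New(1);           /* assume this succeeds */
    PyObject *item = PyTuple_GetItem(tup, 5); /* out of range: returns
                                                 NULL, sets IndexError */
    /* BUG: item is never checked, so the IndexError stays pending. */
    for (int i = 0; i < 100000; i++) {
        PyObject *junk = PyTuple_New(2);      /* can trigger a collection
                                                 with the exception set */
        Py_XDECREF(junk);
    }

If the collector finds an exception set, it reports it as ignored and
then aborts with "unexpected exception during garbage collection" --
the same output the OP quoted.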