Python 2.7: _PyLong_NumBits() Segfault
ned at nedbatchelder.com
Sat Aug 15 13:19:35 CEST 2015
On Saturday, August 15, 2015 at 2:36:44 AM UTC-4, Adam Meily wrote:
> I am working on a CPython library that serializes Python objects to disk in a custom format. I'm using _PyLong_NumBits() to determine whether PyLong_AsLong(), PyLong_AsUnsignedLong(), PyLong_AsLongLong(), or PyLong_AsUnsignedLongLong() should be used.
> In Python 3.x, I'm able to determine how many bits are required to write an int to disk without issue. However, in Python 2.7, _PyLong_NumBits() segfaults when the type is a PyInt with the value 0xffffffff (the maximum unsigned 32-bit value). Additionally, _PyLong_NumBits() is not accurate when the number is negative. My guess is that I can't use _PyLong_NumBits() on a PyInt object. So, what is the correct way to determine the number of bits required to represent a PyInt object in Python 2.7?
> I'm using Python 2.7.6, x86-64 on Ubuntu.
You are right that _PyLong_NumBits is only usable with long objects. For PyInt
there is PyInt_GetMax, which returns the largest value an int can hold (the
same value as sys.maxint). Every int fits in the bit width of that maximum,
so you can use that fixed size for any int.
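To make that concrete, here is a small sketch (in Python, since the arithmetic is the same) of turning the maximum into a fixed bit width. The value 2**63 - 1 is an assumption: it is what PyInt_GetMax() returns on an x86-64 Linux build like the one described above.

```python
# What PyInt_GetMax() returns on a 64-bit Linux build of Python 2.7
# (assumed platform; on 32-bit builds it would be 2**31 - 1).
maxint = 2**63 - 1

# bit_length() gives the value bits; add one for the sign bit,
# since PyInt is a signed C long under the hood.
int_bits = maxint.bit_length() + 1
print(int_bits)  # 64
```

Any PyInt on that platform fits in those 64 bits, so the serializer can treat every int as a fixed-width signed value without asking _PyLong_NumBits at all.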
Keep in mind that _PyLong_NumBits is not a documented and supported API (the
leading underscore marks it as private), so you'll have to be very careful
using it.
Also, you might consider doing work in Python that doesn't absolutely have
to be done in C. It's much easier working in Python than in C.
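For instance, the choice between PyLong_AsLong, PyLong_AsUnsignedLong, and the long-long variants is just a range check, which is easy to express in Python. The sketch below is a hypothetical helper, not code from the original library; it assumes a 32-bit C long by default (pass long_bits=64 for an x86-64 Linux build, where long is 64 bits).

```python
def pick_conversion(value, long_bits=32):
    """Choose the narrowest PyLong_As* conversion that can hold `value`.

    Hypothetical helper mirroring the C-side decision; `long_bits` is the
    width of the platform's C long (32 assumed here for illustration).
    """
    lo, hi = -(1 << (long_bits - 1)), (1 << (long_bits - 1)) - 1
    if lo <= value <= hi:
        return "PyLong_AsLong"            # fits in a signed long
    if 0 <= value < (1 << long_bits):
        return "PyLong_AsUnsignedLong"    # non-negative, fits unsigned
    if -(1 << 63) <= value <= (1 << 63) - 1:
        return "PyLong_AsLongLong"        # fits in a signed 64-bit value
    return "PyLong_AsUnsignedLongLong"    # last resort for huge values

print(pick_conversion(0xffffffff))  # PyLong_AsUnsignedLong
print(pick_conversion(-1))          # PyLong_AsLong
```

Note that 0xffffffff, the value that crashed _PyLong_NumBits in the question, lands cleanly in the unsigned-long branch here.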