[Numpy-discussion] Help - numpy / scipy binary compatibility

Julian Taylor jtaylor.debian at googlemail.com
Mon Aug 11 03:30:30 EDT 2014


On 11.08.2014 08:53, Matthew Brett wrote:
> Hi,
> 
> On Sun, Aug 10, 2014 at 11:41 PM, Ralf Gommers <ralf.gommers at gmail.com> wrote:
>>
>> On Mon, Aug 11, 2014 at 8:39 AM, Ralf Gommers <ralf.gommers at gmail.com>
>> wrote:
>>>
>>> On Sat, Aug 9, 2014 at 5:04 AM, David Cournapeau <cournape at gmail.com>
>>> wrote:
>>>>
>>>> On Sat, Aug 9, 2014 at 10:41 AM, Matthew Brett <matthew.brett at gmail.com>
>>>> wrote:
>>>>>
>>>>> Hi,
>>>>>
>>>>> I would be very happy of some help trying to work out a numpy package
>>>>> binary incompatibility.
>>>>>
>>>>> I'm trying to work out what's happening for this ticket:
>>>>>
>>>>> https://github.com/scipy/scipy/issues/3863
>>>>>
>>>>> which I summarized at the end:
>>>>>
>>>>> https://github.com/scipy/scipy/issues/3863#issuecomment-51669861
>>>>>
>>>>> but basically, we're getting these errors:
>>>>>
>>>>> RuntimeWarning: numpy.dtype size changed, may indicate binary
>>>>> incompatibility
>>>>>
>>>>> I now realize I am lost in the world of numpy / scipy etc. binary
>>>>> compatibility, and I'd really like some advice.  In this case:
>>>>>
>>>>> numpy == 1.8.1
>>>>> scipy == 0.14.0 - compiled against numpy 1.5.1
>>>>> scikit-learn == 0.15.1 - compiled against numpy 1.6.0
>>>>>
>>>>> Can y'all see any potential problem with those dependencies in binary
>>>>> builds?
>>>>>
>>>>> The relevant scipy Cython C files seem to guard against raising this
>>>>> error by doing non-strict checks of, e.g., the numpy dtype size, so I am
>>>>> confused about how these errors come about.  Can anyone give any pointers?
>>>>
>>>>
>>>> Assuming the message is not bogus, I would try importing von_mises in a
>>>> venv containing numpy 1.5.1, then 1.6.0, etc. to detect when the change
>>>> happened.
>>>
>>>
>>> That should be a recent change in 1.8.1, either because the dtype size did
>>> actually change or because the silencing of this message in NoseTester
>>> (numpy.testing) is not effective anymore.
>>>
>>> Note that the warning is too aggressive, because it triggers both on ABI
>>> breaks and on backwards-compatible extensions to dtype. That's why it's
>>> filtered in numpy.testing. This was reported to Cython a while ago; I'm
>>> not sure whether they've fixed the issue in the meantime.
>>
>>
>> Never mind, saw that you already figured out that it's due to scikit-learn:
>> https://github.com/scipy/scipy/issues/3863
> 
> Yes, sorry, I should have reported back to the list - the problem was
> that sklearn is removing all the warnings filters that numpy installs.
> 
> For reference, no, Cython left the warnings as they were.
> 
> As you implied, Cython raises these warnings (if not filtered by
> numpy) when the various numpy C structs, including those for dtype,
> ufunc and ndarray, have increased in size since compile time.  The
> structs get bigger when we add entries to them; this is backwards
> compatible but not forwards compatible.
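
For anyone hitting this, something along these lines should put the
filters back by hand after another package has cleared them (just a
sketch, not necessarily the exact code numpy.testing uses; the dtype
message is the one from the traceback above, and I am assuming the
ufunc and ndarray variants follow the same pattern):

    import warnings

    # Silence the over-aggressive Cython size-changed warnings again
    # after something (here, sklearn) has wiped the filters.
    warnings.filterwarnings("ignore", message="numpy.dtype size changed",
                            category=RuntimeWarning)
    warnings.filterwarnings("ignore", message="numpy.ufunc size changed",
                            category=RuntimeWarning)
    warnings.filterwarnings("ignore", message="numpy.ndarray size changed",
                            category=RuntimeWarning)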

Should we deprecate direct use of the ufunc and dtype structures?
Or are their internals used too much outside of numpy?

I am thinking about changing the ufunc size yet again for 1.10, and it
already has far too many members that third parties should probably never
have seen.
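
For context, the check behind these warnings boils down to comparing
the struct size an extension was compiled against with the size the
installed numpy reports at runtime (the __basicsize__ of the type
objects).  Roughly, in Python, with made-up compile-time numbers
standing in for what a real extension would have baked in:

    import warnings
    import numpy as np

    # Struct sizes as the installed numpy reports them at runtime
    # (the tp_basicsize of the corresponding type objects).
    runtime_sizes = {
        "numpy.dtype": np.dtype.__basicsize__,
        "numpy.ufunc": np.ufunc.__basicsize__,
        "numpy.ndarray": np.ndarray.__basicsize__,
    }

    # Placeholder values standing in for sizeof(...) recorded at an
    # extension's compile time; deliberately small here so that only
    # the warning branch fires when this is run.
    compiled_sizes = {"numpy.dtype": 16, "numpy.ufunc": 16,
                      "numpy.ndarray": 16}

    for name, compiled in compiled_sizes.items():
        runtime = runtime_sizes[name]
        if runtime > compiled:
            # Struct grew since compile time: old binaries still work,
            # which is why this case is only a warning.
            warnings.warn("%s size changed, may indicate binary "
                          "incompatibility" % name, RuntimeWarning)
        elif runtime < compiled:
            # Struct shrank: an old binary could read past the end of
            # the runtime struct, so this has to be a hard error.
            raise ValueError("%s has the wrong size" % name)

So every time the ufunc struct grows, everything compiled against an
older numpy starts warning on import, even though nothing is actually
broken.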


