Help: numpy / scipy binary compatibility
Hi,

I would be very happy to have some help trying to work out a numpy package binary incompatibility. I'm trying to work out what's happening for this ticket:

https://github.com/scipy/scipy/issues/3863

which I summarized at the end:

https://github.com/scipy/scipy/issues/3863#issuecomment51669861

but basically, we're getting these errors:

RuntimeWarning: numpy.dtype size changed, may indicate binary incompatibility

I now realize I am lost in the world of numpy / scipy etc. binary compatibility, and I'd really like some advice. In this case:

numpy == 1.8.1
scipy == 0.14.0, compiled against numpy 1.5.1
scikit-learn == 0.15.1, compiled against numpy 1.6.0

Can y'all see any potential problem with those dependencies in binary builds?

The relevant scipy Cython C files seem to guard against raising this error by doing non-strict checks of, e.g., the numpy dtype, so I am confused how these errors come about. Can anyone give any pointers?

Cheers,
Matthew
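[Editor's note: the check that emits this warning is a size comparison that Cython generates at module import time. A minimal Python sketch of that logic, with a hypothetical function name and not the real generated C code, might look like:]

```python
import warnings


def check_type_size(type_name, compiled_size, runtime_size):
    """Sketch of the import-time struct size check Cython emits.

    A runtime struct *smaller* than the compile-time one is a hard ABI
    break; a *larger* one is assumed to be a backwards-compatible
    extension, but still triggers the RuntimeWarning quoted above.
    """
    if runtime_size < compiled_size:
        raise ValueError(
            "%s has the wrong size, try recompiling" % type_name)
    if runtime_size > compiled_size:
        warnings.warn(
            "%s size changed, may indicate binary incompatibility"
            % type_name, RuntimeWarning)
```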
On Sat, Aug 9, 2014 at 10:41 AM, Matthew Brett wrote:
> [quoted message trimmed]
Assuming the message is not bogus, I would try importing von_mises with a venv containing numpy 1.5.1, then 1.6.0, etc., to detect when the change happened.

David
On Sat, Aug 9, 2014 at 5:04 AM, David Cournapeau wrote:
> [quoted thread trimmed]
That should be a recent change in 1.8.1, either because the dtype size did actually change or because the silencing of this message in NoseTester (numpy.testing) is not effective anymore.

Note that the warning is too aggressive, because it triggers both on ABI breaks and on backwards-compatible extensions to dtype. That's why it's filtered in numpy.testing. This was reported to Cython a while ago; not sure whether they've fixed the issue in the meantime or not.

Ralf
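[Editor's note: the filtering Ralf describes amounts to installing warning filters like the ones below before tests run. The message patterns are taken from the warning text quoted in this thread; treat the exact filter arguments as an approximation of what numpy.testing does, not its literal source.]

```python
import warnings

# Silence the over-aggressive size-change warnings, roughly as
# numpy.testing does.  The "message" argument is a regex matched
# against the start of the warning message.
warnings.filterwarnings("ignore", message="numpy.dtype size changed")
warnings.filterwarnings("ignore", message="numpy.ufunc size changed")
```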
On Mon, Aug 11, 2014 at 8:39 AM, Ralf Gommers wrote:
> [quoted thread trimmed]
Never mind, I saw that you already figured out that it's due to scikit-learn: https://github.com/scipy/scipy/issues/3863

Ralf
Hi,
On Sun, Aug 10, 2014 at 11:41 PM, Ralf Gommers wrote:
> [quoted thread trimmed]
Yes, sorry, I should have reported back to the list: the problem was that sklearn is removing all the warnings filters that numpy installs.

For reference, no, Cython left the warnings as they were. As you implied, Cython raises these warnings (if not filtered by numpy) when the various numpy C structs, including those for dtype, ufunc and ndarray, have increased in size since compile time. The structs get bigger when we add entries to them; this is backwards compatible but not forwards compatible.

Because I got very confused, I wrote this little piece of code to show the current compiled and in-memory sizes of dtypes, ndarrays and ufuncs:

https://github.com/matthewbrett/npsizes

Giving (result of ./npreport) on OSX:

Numpy version: 1.5.1
  dtype: static size 80; memory size 80
  ndarray: static size 80; memory size 80
  ufunc: static size 144; memory size 144
Numpy version: 1.6.0
  dtype: static size 80; memory size 80
  ndarray: static size 80; memory size 80
  ufunc: static size 144; memory size 144
Numpy version: 1.7.1
  dtype: static size 88; memory size 88
  ndarray: static size 80; memory size 80
  ufunc: static size 176; memory size 176
Numpy version: 1.8.1
  dtype: static size 88; memory size 88
  ndarray: static size 80; memory size 80
  ufunc: static size 192; memory size 192

Cheers,
Matthew
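[Editor's note: the "memory size" column in the report above can be read straight off the Python type objects via the standard CPython `__basicsize__` attribute, which exposes the C-level `tp_basicsize`. A minimal sketch; the printed numbers depend on the numpy version and platform, so none are asserted here:]

```python
import numpy as np

# tp_basicsize of the type objects backing dtype, ndarray and ufunc;
# this is the runtime "memory size" that the compile-time sizeof()
# baked into a Cython extension gets checked against at import.
for name, typ in [("dtype", np.dtype),
                  ("ndarray", np.ndarray),
                  ("ufunc", type(np.add))]:
    print("%s: memory size %d" % (name, typ.__basicsize__))
```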
On 11.08.2014 08:53, Matthew Brett wrote:
> [quoted thread trimmed]
Should we deprecate use of the ufunc and dtype structures? Or are their internals used too much outside of numpy? I am thinking about changing the ufunc size yet again for 1.10, and it already has far too many members that third parties should probably never have seen.
participants (4)

- David Cournapeau
- Julian Taylor
- Matthew Brett
- Ralf Gommers