Suppose I would like to take advantage of some functions from MKL in numpy's C source code, which would require using
#include "mkl.h"
Ideally this include line must not break the build of numpy when MKL is not present, so my initial approach was to use
#if defined(SCIPY_MKL_H)
#include "mkl.h"
#endif
Unfortunately, this did not work when building with gcc on a machine where MKL is present on the default LD_LIBRARY_PATH: the distutils code then set the SCIPY_MKL_H preprocessor variable even though the MKL headers were not on the C_INCLUDE_PATH.
What is the preferred way to include an external library header while ensuring that the codebase continues to build in the most common cases?
One approach I can think of is to set a preprocessor variable, say HAVE_MKL_HEADERS, in numpy/core/includes/numpy/config.h depending on the outcome of compiling a simple _configtest.c with config.try_compile(), as is done in numpy/core/setup.py.
Is there a simpler, or a better way?
Thank you, Oleksandr
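[As a sketch, the _configtest.c probe described above might be as small as the following. HAVE_MKL_HEADERS is a name assumed here, not an existing macro; the file compiles only when mkl.h is on the include path, which is exactly what config.try_compile() would report. It cannot build on a machine without MKL — that failure is the signal.]

```c
/* _configtest.c -- compiled via config.try_compile(); success means
 * the MKL headers are usable and HAVE_MKL_HEADERS can be written to
 * config.h, failure means they are not. */
#include "mkl.h"

int main(void)
{
    /* touch a real MKL symbol so a broken installation also fails */
    MKLVersion v;
    mkl_get_version(&v);
    return 0;
}
```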
hi, you could put the header into OPTIONAL_HEADERS in numpy/core/setup_common.py. This will define HAVE_HEADERFILENAME_H for you, but it will not check that the corresponding library actually exists and can be linked. For that, SCIPY_MKL_H is probably the right macro, though its name is confusing as it does not check for the header's presence ...
Can you tell us more about what from MKL you are attempting to add and for what purpose, e.g. is it something that should go into numpy proper or just for personal/internal use?
cheers, Julian
Hi Julian,
Thank you very much for the response. It appears to work.
I work on "Intel Distribution for Python" at Intel Corp. This question was motivated by work needed to prepare pull requests with our changes/optimizations to numpy source code. In particular, the numpy.random_intel package
https://mail.scipy.org/pipermail/numpy-discussion/2016-June/075693.html
relies on MKL, but its potential inclusion in numpy should not break the build if MKL is unavailable.
Our benchmarking also pointed to NumPy's sequential memory copying as a bottleneck. I am working to open a pull request into the main trunk of numpy to take advantage of the multithreaded dcopy function from MKL's BLAS to do memory copying in parallel for sufficiently large sizes.
Related to numpy.random_intel, I noticed that the randomstate package, which extends numpy.random, was not made a part of numpy but rather published on PyPI as a standalone module. Does that mean the community decided against including it in numpy's codebase? If so, I would appreciate it if someone could elaborate on or point me to the reasoning behind that decision.
Thank you, Oleksandr
On Thu, Sep 29, 2016 at 6:27 PM, Pavlyk, Oleksandr <oleksandr.pavlyk@intel.com> wrote:
Related to numpy.random_intel, I noticed that the randomstate package, which extends numpy.random, was not being made a part of numpy, but rather published on PyPI as a standalone module. Does that mean that the community decided against including it in numpy's codebase? If so, I would appreciate if someone could elaborate on or point me to the reasoning behind that decision.
No, we are just working out the API and the extensibility machinery in a separate package before committing to backwards compatibility.
 Robert Kern
Pavlyk,
NumExpr optionally includes MKL's VML at compile time. You may want to look at its implementation. From what I recall, it relies on a function in a bootstrapped __config__.py to determine if MKL is present.
Robert
participants (4)
- Julian Taylor
- Pavlyk, Oleksandr
- Robert Kern
- Robert McLeod