Hi all!

I am currently developing a Python module in C/C++ which is supposed to access nd arrays created in Python with

a = numpy.array([1,1,1])

I want to pass the array in as follows and use its data for further processing in C:

mymod.doSthg(a)

The example code on http://numpy.sourceforge.net/numdoc/HTML/numdoc.htm#pgfId-36721

(!PyArg_ParseTuple(args, "O!", &PyArray_Type, &array))

does not work for nd arrays. I always get

TypeError: argument 1 must be array, not numpy.ndarray

I assume the error comes from the constant provided as the third parameter, which says the input must be of PyArray_Type and not an nd array.

So here are my questions:
1. Is there any good tutorial / example code for accessing nd arrays in C?
2. What is the difference between the two (arrays and nd arrays)? I am new to Python and heard there were different approaches to arrays and that nd arrays work better for multi-dimensional applications. Is this right?
3. Which of the two will be used in the future?

Thank you in advance for your help, Thomas
On http://www.scipy.org/JorisDeRidder I've just put an example of how I passed multidimensional NumPy arrays to C++ using ctypes. Perhaps it's helpful for your application. I haven't put it in the cookbook yet, because I would first like to test it a bit more; up to now I haven't run into any bugs, though. Joris
On Wed, Apr 23, 2008 at 3:56 AM, Joris De Ridder <Joris.DeRidder@ster.kuleuven.be> wrote:
On http://www.scipy.org/JorisDeRidder I've just put an example how I passed multidimensional Numpy arrays to C++ using ctypes. Perhaps it's helpful for your application. I didn't put it in the cookbook yet, because I would first like to test it a bit more. Up to now I didn't experience any bugs though.
Joris
Hi Joris, this is a great (short) recipe! Could you elaborate on the line "You need to compile myextension.cpp and make a shared library from it. The easiest way is to use Scons with the constructor file:"? How do you call Scons in your example? On Windows (i.e. cygwin?), Linux and on OS X? Scons is now part of numpy (svn), right? (At least the scons version you mean.) Thanks again, Sebastian Haase
Sebastian Haase wrote:
Hi Joris, this is a great ( short ) recipe ! Could you elaborate on the line "You need to compile myextension.cpp and make a shared library from it. The easiest way is to use Scons with the constructor file:" !? How do you call Scons in your example !? On Windows ( = cygwin !? ) , Linux and on OS-X ?
I think Joris mentioned scons because that's the easiest way to build a shared library in a cross-platform way (taking care of -fPIC on Unix, linker options, etc.). Scons is like make, and SConstruct files are like makefiles: you just call scons instead of make when you have a SConstruct file.
Scons is now part of numpy (svn), right ? (at least the scons version you mean)
I think this has nothing to do with using scons to build numpy/scipy per se (numpy svn does not include scons, BTW, only hooks to call scons from distutils). cheers, David
Hi David! Is it possible to construct a Python module with Scons without tampering with different flags? I think you built your own builder (just like I did), but did you manage to put it in Scons 0.98? Matthieu
Matthieu Brucher wrote:
Hi David !
Is it possible to construct a Python module with Scons without tampering with different flags? I think you built your own builder (just like I did), but did you manage to put it in Scons 0.98 ?
Unfortunately, no. I had to make a choice about which features to push to a good enough quality for the 0.98 timeframe, and better Fortran support was more important to me than a Python extension builder (I can incorporate a Python extension builder without touching the scons sources, but fixing Fortran support meant that my own scons was different from the official scons, which is not good). Having a good Python extension builder is also more difficult than I first expected (you can't retrieve options for MS compilers from distutils, for example, and I would like to support building both with and without distutils' help). cheers, David
Having a good python extension builder is also more difficult than I first expected, too (you can't retrieve options for MS compilers from distutils, for example and I would like to support building with and without distutils' help).
What are these options that must be retrieved? With a simple extension builder, we haven't encountered any bugs, although we have a fair number of extensions. Matthieu -- French PhD student Website : http://matthieu-brucher.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher
Matthieu Brucher wrote:
Having a good python extension builder is also more difficult than I first expected, too (you can't retrieve options for MS compilers from distutils, for example and I would like to support building with and without distutils' help).
What are these options that must be retrieved?
the compiler, the compiler flags, the linker, the linker flags, the location of Python.h, etc...
With a simple extension builder, we haven't encountered any bugs, although we have a fair number of extensions.
Building a hello-world-like extension or, say, numpy.core is kind of the same from a build point of view. The problem really is to work in many different configurations (for example, supporting building Python extensions for a different Python than the one running scons), on different platforms. Waf (a project which started as a fork of scons and is now a totally different project for all purposes) has a python tool, and it is 300 lines of code: http://code.google.com/p/waf/source/browse/trunk/wafadmin/Tools/python.py And it has more support for this kind of thing than scons. My current work on a Python extension builder for scons is ~200 lines, not including unit tests. cheers, David
the compiler, the compiler flags, the linker, the linker flags, the location of Python.h, etc...
With a simple extension builder, we haven't encountered any bugs, although we have a fair number of extensions.
Building a helloworld like extension or say numpy.core is kind of the same from a build point of view. The problem really is to work in many different configurations (for example supporting building python extensions for a different python than the one running scons), on different platforms.
Indeed, if you want a different Python, a more complex builder has to be written, I agree ;) Matthieu
Hi Sebastian,
this is a great ( short ) recipe !
Thanks!
Could you elaborate on the line "You need to compile myextension.cpp and make a shared library from it. The easiest way is to use Scons with the constructor file:"?
David already gave the answer. Scons allows you to make shared libraries of your C++ code without having to worry about whether it should be a .dll, a .so, or a .dylib library. Plus, it's very easy to install. Cheers, Joris
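For readers curious what the compiled side of such a ctypes recipe can look like, here is a minimal, hedged sketch of a plain C function operating on a flat buffer. This is not Joris's actual ndarray.h interface; the function name and the pointer-plus-shape signature are just illustrative assumptions.

/* sum2d.c -- illustrative only; assumes the caller passes a C-contiguous
 * block of doubles together with its shape. */
double sum2d(const double *data, int nrows, int ncols)
{
    int i, j;
    double total = 0.0;
    for (i = 0; i < nrows; ++i)
        for (j = 0; j < ncols; ++j)
            total += data[i * ncols + j];  /* row-major indexing */
    return total;
}

Compiled into a shared library (the g++ commands Joris gives further down the thread do essentially this), it can then be loaded from Python via ctypes (np.ctypeslib, mentioned later in the thread) and handed the data pointer of a C-contiguous array.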
On Apr 22, 2008, at 9:56 PM, Joris De Ridder wrote:
On http://www.scipy.org/JorisDeRidder I've just put an example how I passed multidimensional Numpy arrays to C++ using ctypes. Perhaps it's helpful for your application. I didn't put it in the cookbook yet, because I would first like to test it a bit more. Up to now I didn't experience any bugs though.
Joris
As a total newbie in the field of extensions, where do I find ndarray.h and numpyctypes? Cheers Tommy
They are attached to the wiki page. Click on "Attachments" in the menu on the left. Joris
On Apr 23, 2008, at 11:26 AM, Joris De Ridder wrote:
They are attached to the wiki page. Click on "Attachments" in the menu on the left.
Joris
Thanks. Didn't know that wikis had that :) I tried your example on Mac OS X 10.5.2 (I am not using Scons) and got

[skathi:~/Work/myCode/pyCextensions] tgrav% gcc -dynamiclib myextension.cpp -o libmyextension.dylib
Undefined symbols:
  "___gxx_personality_v0", referenced from:
      ___gxx_personality_v0$non_lazy_ptr in cct0CmDB.o
ld: symbol(s) not found
collect2: ld returned 1 exit status

Anyone have any idea what I am doing wrong? Cheers Tommy
These "personality" errors indicate a version mismatch with the gcc compiilers. ** Bill Spotz ** ** Sandia National Laboratories Voice: (505)845-0170 ** ** P.O. Box 5800 Fax: (505)284-0154 ** ** Albuquerque, NM 87185-0370 Email: wfspotz@sandia.gov **
Try

g++ -o myextension.os -c -fPIC myextension.cpp
g++ -o libmyextension.dylib -dynamiclib myextension.os

at the bash prompt. (Or use Scons :-). Joris
2008/4/23 Tommy Grav <tgrav@mac.com>:
As a total newbie in the field of extensions, where do I find ndarray.h and numpyctypes?
import numpy as np
print np.get_include()

numpyctypes is np.ctypeslib, I assume. Regards Stéfan
On Tue, Apr 22, 2008 at 4:38 PM, Thomas Hrabe <thrabe@burnham.org> wrote:
Hi all!
I am currently developing a python module in C/C++ which is supposed to access nd arrays as defined by the following command in python
a = numpy.array([1,1,1])
I want to access the array the following way and use the nd array data for further processing in C.
mymod.doSthg(a)
The example code on http://numpy.sourceforge.net/numdoc/HTML/numdoc.htm#pgfId-36721
(!PyArg_ParseTuple(args, "O!",&PyArray_Type, &array))
does not work for nd arrays. I always get TypeError: argument 1 must be array, not numpy.ndarray
That page refers to Numeric, numpy's old predecessor. If you #included "Numeric/arrayobject.h" at the top of that C file, then the C extension is expecting a Numeric array rather than a numpy one. Here is more up-to-date documentation: http://projects.scipy.org/scipy/numpy/wiki/NumPyCAPI
I assume the error is the constant provided as the third parameter, saying that the input is of PyArray_Type and not an nd array.
So here are my questions: 1. is there any good tutorial / example code for accessing nd arrays in C? 2. what is the difference between both (arrays and nd arrays? - I am new to python and heard there were different approaches to arrays and that nd arrays work better for multi dimensional applications. Is this right?)
There is a bit of terminological confusion. The Python standard library has a module named "array" which provides homogeneous, typed 1D arrays. numpy provides a much richer array object (technically, numpy.ndarray, but you usually construct one using the numpy.array() function). If you have numpy, the standard library's array module is of very little use; you will almost always want to use numpy. This has nothing to do with the error you got above.
3. which of the two will be used in the future?
Well, the standard library's array module will continue to exist as will numpy. Numeric, the likely cause of your problem above, is no longer maintained, so please use numpy. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
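To make that concrete, a bare-bones numpy (not Numeric) version of Thomas's mymod.doSthg(a) example might look roughly like the following sketch -- Python 2 style, minimal error handling, and only intended as an illustration of the pattern, not copied from the documentation page above:

/* mymod.c -- minimal sketch of a numpy-based extension (not Numeric).
 * doSthg(a) accepts an ndarray and returns its number of dimensions. */
#include <Python.h>
#include <numpy/arrayobject.h>

static PyObject *
doSthg(PyObject *self, PyObject *args)
{
    PyArrayObject *array;
    /* The same "O!" pattern as in the old Numeric docs, but now
       PyArray_Type comes from the numpy headers. */
    if (!PyArg_ParseTuple(args, "O!", &PyArray_Type, &array))
        return NULL;
    return Py_BuildValue("i", PyArray_NDIM(array));
}

static PyMethodDef MymodMethods[] = {
    {"doSthg", doSthg, METH_VARARGS, "Accept an ndarray and return its ndim."},
    {NULL, NULL, 0, NULL}
};

PyMODINIT_FUNC
initmymod(void)
{
    Py_InitModule("mymod", MymodMethods);
    import_array();  /* required once per module before using the numpy C API */
}

Built against the numpy headers (numpy.get_include(), as discussed further down the thread), this should accept a numpy.ndarray rather than raising the TypeError above.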
Hello, On Tue, Apr 22, 2008 at 11:38 PM, Thomas Hrabe <thrabe@burnham.org> wrote:
I am currently developing a python module in C/C++ which is supposed to access nd arrays as defined by the following command in python
You might also be interested in: http://mathema.tician.de/software/pyublas Regards, Albert
On Wednesday 23 April 2008, Albert Strasheim wrote:
Hello,
On Tue, Apr 22, 2008 at 11:38 PM, Thomas Hrabe <thrabe@burnham.org> wrote:
I am currently developing a python module in C/C++ which is supposed to access nd arrays as defined by the following command in python
You might also be interested in:
Argh-- you scooped me! :) I'm preparing version 0.92.1 right now, due out later today. It should iron out most of the wrinkles that are still present in the current version, 0.92. --Andreas
Hi all,

First of all, thank you for the rich spectrum of answers, especially for Joris' sample code. What I still do not understand is the following: one can find multiple approaches to arrays, the old Numeric array and the new ndarray, and the new stuff seems to be backward compatible. However, where is free documentation of the ndarray? Which headers do I have to use in order to access the ndarray object, which, in my eyes, should be compatible with PyObject in some way? I find definitions of the old array types, but none for the ndarray. I tend to give Joris' code a try, but the whole situation confuses me. Joris' code (the ndarray.h) does not include any numpy object definition, for instance.

All I want to do is the following:
a = numpy.array([1,1,1])
a.__class__
<type 'numpy.ndarray'>
mymod.run(a)

run in C should look roughly like this:

PyObject run{
    PyObject* ndArray;
    PyArg_ParseTuple( argv, ???? ,ndArray);
    for(x axis)
        for(y axis)
            do something with ndArray.data[x][y];
    return something.
}

Is it possible to access the ndarray object like this? I do not want to create new array objects right now, maybe later. There must be a predefined way to access the data in C without using a custom hack. Finally, my primary aim is to access 3-dimensional arrays of floats in C.

Best, Thomas
Thomas Hrabe wrote:
One can find multiple approaches to arrays, the old numeric array and the new ndarray. The new stuff seems to be backward compatible.
mostly, yes. numpy (ndarray) is the only option going forward, as it's the only one being maintained.
However, where is free documentation of the ndarray?
You are right, the most complete set is Travis' book. It's a pretty good deal when you consider that he wrote much of numpy -- if it saves you an hour, it's worth the money! That being said, I think you were pointed to: http://projects.scipy.org/scipy/numpy/wiki/NumPyCAPI Which should get you going, though it does seem to be missing your key question:
Which headers do I have to use in order to access the ndarray object that, in my eyes, should be compatible with PyObject in some way?
#include <numpy/arrayobject.h>

You should use distutils to compile your extension, and in your setup.py you can put:

include_dirs=[numpy.get_include()]

and the header should be found.
run in C should look like this: PyObject run{ PyObject* ndArray; PyArg_ParseTuple( argv, ???? ,ndArray);
I think you had that right before -- you were just using the wrong headers: PyArg_ParseTuple(args, "O!",&PyArray_Type, &array)
Is it possible to access the ndarray object like this?
Yes, something like that certainly should work. When you've got it worked out, please add it to the Wiki. I'd poke around the wiki, and this mailing list's archives, for more examples.

NOTE: Most folks now think that the pain of writing extensions completely by hand is not worth it -- it's just too easy to make reference-counting mistakes, etc. Most folks are now using one of:

- Cython (or Pyrex)
- SWIG
- ctypes

For example, the SWIG typemaps distributed with numpy make it very easy to pass a numpy array into a C function such as:

void MyFunc(int i, int j, int k, double *arr)

where *arr is a pointer to a block of doubles that represents an (i,j,k) 3-d array. Equally easy methods are available with Cython and ctypes.

-Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker@noaa.gov
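For the 3-dimensional float case Thomas said he is ultimately after, the hand-written C version of such a function might look roughly like this sketch (untested, function name invented; it assumes the same module skeleton, numpy headers and import_array() call as any other numpy extension):

/* sum3d(a): sketch of summing a 3-D float32 ndarray element by element. */
static PyObject *
sum3d(PyObject *self, PyObject *args)
{
    PyArrayObject *arr;
    npy_intp i, j, k, ni, nj, nk;
    double total = 0.0;

    if (!PyArg_ParseTuple(args, "O!", &PyArray_Type, &arr))
        return NULL;
    if (PyArray_NDIM(arr) != 3 || PyArray_TYPE(arr) != NPY_FLOAT) {
        PyErr_SetString(PyExc_TypeError, "expected a 3-D float32 array");
        return NULL;
    }
    ni = PyArray_DIM(arr, 0);
    nj = PyArray_DIM(arr, 1);
    nk = PyArray_DIM(arr, 2);
    for (i = 0; i < ni; i++)
        for (j = 0; j < nj; j++)
            for (k = 0; k < nk; k++)
                /* PyArray_GETPTR3 follows the strides, so this also
                   works for non-contiguous arrays */
                total += *(float *) PyArray_GETPTR3(arr, i, j, k);
    return Py_BuildValue("d", total);
}

If you would rather let numpy coerce the input to the dtype and layout you want (at the cost of a possible copy), the C API also has conversion helpers such as PyArray_FROM_OTF; but as Chris says, a SWIG/Cython/ctypes wrapper spares you most of this boilerplate.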
On Wednesday 23 April 2008, Christopher Barker wrote:
NOTE: Most folks now think that the pain of writing extensions completely by hand is not worth it -- it's just too easy to make reference counting mistakes, etc. Most folks are now using one of:
Cython (or Pyrex) SWIG ctypes
IMO, all of these deal better with C than they do with C++. There are also a number of more C++-affine solutions:

- Boost Python [1]. Especially if you want usable C++ integration (i.e. more than basic templates, etc.).
- sip [2]. Used for PyQt.

Andreas

[1] http://www.boost.org/doc/libs/1_35_0/libs/python/doc/index.html
[2] http://www.riverbankcomputing.co.uk/sip/index.php
Andreas Klöckner wrote:
IMO, all of these deal better with C than they do with C++.
True, though SWIG works OK with C++.
There are also a number of more C++-affine solutions:
- Boost Python [1]. Especially if you want usable C++ integration. (ie. more than basic templates, etc.)
What's the status of the Boost array object? maintained? updated for recent numpy?
- sip [2]. Used for PyQt.
Any numpy-specific stuff for sip? -Chris
On Wednesday 23 April 2008, Christopher Barker wrote:
What's the status of the Boost array object? maintained? updated for recent numpy?
The numeric.hpp included in Boost.Python is a joke. It does not use the native API. PyUblas [1] fills this gap, by allowing you to use Boost.Ublas on the C++ side and Numpy on the Python side. It is somewhat like what Hoyt describes, except for a different environment. Here's a table:

                   | Hoyt       | Andreas
-------------------+------------+-------------
C++ Matrix Library | Blitz++    | Boost.Ublas
Wrapper Generator  | Weave      | Boost.Python
Wrapper            | w_wrap.tgz | PyUblas

:) Sorry, that was too much fun to pass up.

[1] http://tiker.net/doc/pyublas/index.html
- sip [2]. Used for PyQt.
Any numpy-specific stuff for sip?
Not as far as I'm aware. In fact, I don't know of any uses of sip outside of Qt/KDE-related things. Andreas
On Wed, Apr 23, 2008 at 09:47:46PM -0400, Andreas Klöckner wrote:
Any numpy-specific stuff for sip?
Not as far as I'm aware. In fact, I don't know of any uses of sip outside of Qt/KDE-related things.
Airbus uses it for heavy numerical work. They claim they have benchmarked all the tools and SIP was the fastest. If you want more information on that, you should contact the sip developer, Phil Thompson; he does some contracting work for Airbus. Cheers, Gaël
Let me also say that I tried using Boost.Python a while back to interface numpy with C++, and, while I got some things working, I found the distribution and packaging end of things an order of magnitude more complicated than what I found with weave. Since weave is built into scipy, as is blitz itself (recently put under the BSD license), it fits really well with numpy's distutils. The only dependencies on the user end are numpy and scipy, which they would need anyway for the Python end of the code.

Also, I find the syntax of blitz++ to be really simple and numpy-like, e.g.

Array<double, 2> A;
A.resize(20,20);
for(int i = 0; i < 20; ++i)
    A(i, Range::all() ) = i;
Array<double, 1> B = A(Range(2,18), 0);
B *= 2;
A(2, Range(5,10) ) += 2;
Array<double, 1> C = B*A(2, Range(1,17) );

and so on (I just typed this in here, probably with some typos). Anyway, after bouncing around for a bit I think I've found this to be the solution that most closely fits my needs.

--Hoyt
-- +++++++++++++++++++++++++++++++++++ Hoyt Koepke UBC Department of Computer Science http://www.cs.ubc.ca/~hoytak/ hoytak@gmail.com +++++++++++++++++++++++++++++++++++
On Wednesday 23 April 2008 15:48:23 Christopher Barker wrote:
- Boost Python [1]. Especially if you want usable C++ integration. (ie. more than basic templates, etc.)
What's the status of the Boost array object? maintained? updated for recent numpy?
The boost.python array object is still maintained. However, it has a few problems:

1. All array operations go through Python, which makes it too slow for my purposes. Phil Austin posted an alternate class on this list which works well since it uses the numpy C API: http://www.eos.ubc.ca/research/clouds/software/pythonlibs/num_util/num_util_...

2. Only Numeric & numarray are supported out of the box, but it is simple to support numpy; just add the following after calling import_array in your extension module:

boost::python::numeric::array::set_module_and_type( "numpy", "ndarray" );

3. If you want the C++ way of dealing with numpy matrices & vectors directly as objects, look at either of the following:

http://mail.python.org/pipermail/cplusplus-sig/2008-October/013825.html
http://mathema.tician.de/software/pyublas

Of course, I am biased towards the first approach. Regards, Ravi
Oops, please ignore my previous message. I just started using a new mail client which marked some of my old messages (which I had tagged as interesting) the same as new messages, and I just blindly replied to them without checking the date. Sorry about the spam. Ravi
Just to add something from personal experience in case it's useful... I write a lot of code that works on numpy arrays and goes between Python and C++ (I'm too used to OO to stick with pure C). I've been using scipy.weave to interface to blitz++ arrays in my C++ code. The blitz++ library has been wonderful in my experience -- it supports arrays up to 11 dimensions, slicing, etc. -- essentially a very useful subset of numpy's operations. It's also nice because it can interface without copying any data; it just provides a wrapper class that sits on a numpy array. The library is included as part of scipy.

I recently wrote a function that automatically generates wrappers for a C++ function that accepts blitz++ arrays, which I posted to the scipy list. You just specify function names and the list of argument types (nparrays included) and it generates the wrapper code. Quite easy. It's pertinent to the discussion here, so I'll send it again. Included is a full working example with setup.py. Let me know if it helps / is confusing / doesn't work / etc.

--Hoyt
Christopher Barker wrote:
I'd poke around the wiki, and this mailing list's archives, for more examples.
This may help: http://scipy.org/Cookbook/C_Extensions -CHB
Finally! http://scipy.org/Cookbook/C_Extensions is a nice example.

I have discovered a bug in ndarrayobject.h, by the way. I do not know which numpy version we have installed here, but when compiling my C code with -pedantic I got these errors:

/home/global/python32/lib/python2.4/site-packages/numpy/core/include/numpy/ndarrayobject.h:192: error: comma at end of enumerator list
/home/global/python32/lib/python2.4/site-packages/numpy/core/include/numpy/ndarrayobject.h:199: error: comma at end of enumerator list
/home/global/python32/lib/python2.4/site-packages/numpy/core/include/numpy/ndarrayobject.h:211: error: comma at end of enumerator list

I have no access to the newest numpy version, so if anybody has, please fix it... (in case it's not there on purpose for something).

Thank you, and I bet I will post here again soon, Thomas
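For context, the construct -pedantic is objecting to is simply a trailing comma after the last enumerator, which C89 (and C++98) do not allow -- a tiny illustration, not the actual contents of ndarrayobject.h:

/* illustration only */
enum example_flags {
    EXAMPLE_A,
    EXAMPLE_B,
    EXAMPLE_C,   /* <- this trailing comma is what -pedantic reports as
                       "comma at end of enumerator list" */
};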
On Wed, Apr 23, 2008 at 3:46 PM, Thomas Hrabe <thrabe@burnham.org> wrote:
I have discovered a bug in ndarrayobject.h by the way. I do not know which numpy version we have installed here, but when compiling my C code with -pedantic I got this error:
/home/global/python32/lib/python2.4/site-packages/numpy/core/include/numpy/ndarrayobject.h:192: error: comma at end of enumerator list
/home/global/python32/lib/python2.4/site-packages/numpy/core/include/numpy/ndarrayobject.h:199: error: comma at end of enumerator list
/home/global/python32/lib/python2.4/site-packages/numpy/core/include/numpy/ndarrayobject.h:211: error: comma at end of enumerator list
I have no access to the newest numpy version, so if anybody has, please fix it... (in case its not on full purpose for something...)
Fixed in SVN, thank you. -- Robert Kern
participants (15)

- Albert Strasheim
- Andreas Klöckner
- Bill Spotz
- Christopher Barker
- David Cournapeau
- Gael Varoquaux
- Hoyt Koepke
- Joris De Ridder
- Matthieu Brucher
- Ravi
- Robert Kern
- Sebastian Haase
- Stéfan van der Walt
- Thomas Hrabe
- Tommy Grav