
David Cournapeau wrote:
- I would also prefer having C instead of C++ as well - in this case, C++ does not bring much since we have our "templating" system and you don't use the STL much.
It was mainly for complex numbers, since MSVC does not support ISO C99 (in particular its complex types).
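For what it is worth, here is a minimal sketch (not code from the patch) of that substitution: C99 complex arithmetic is rejected by MSVC, while std::complex compiles anywhere a C++ compiler does:

    #include <complex>

    /* The C99 spelling, which MSVC rejects:
     *     #include <complex.h>
     *     double _Complex z = 1.0 + 2.0 * I;
     */
    std::complex<double> scale(std::complex<double> z, double a)
    {
        return a * z;  /* operator overloading stands in for C99's built-in complex ops */
    }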
- In any case, please do not use exception, it is not portable.
OK. The STL containers like std::vector<> can throw exceptions such as std::bad_alloc, which is why I did this.
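The exceptions can be contained inside the extension so they never cross the C API boundary. A sketch of that pattern (apply_filter is a hypothetical function name, not from the patch):

    #include <Python.h>
    #include <new>
    #include <vector>

    static PyObject *apply_filter(PyObject *self, PyObject *args)
    {
        try {
            std::vector<double> zbuffer(1024);   /* may throw std::bad_alloc */
            /* ... filtering work ... */
            Py_RETURN_NONE;
        }
        catch (const std::bad_alloc &) {
            return PyErr_NoMemory();             /* translate into a Python MemoryError */
        }
    }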
- you cannot use Py_ssize_t, as it is python 2.5 >= feature - there is nothing wrong with npy_intp, I don't understand your comment.
Yes there is, cf. PEP 353. Using Py_intptr_t for indexing would depend on sizeof(void*) == sizeof(size_t), which the C standard does not mandate. They can differ on segmented (segment:offset) architectures; two examples are 16-bit x86 (cf. far and near pointers) and x86 with 36-bit PAE. It only happens to work for flat 32- and 64-bit address spaces. This is why Python uses Py_ssize_t instead of Py_intptr_t, and as it happens, npy_intp is typedef'ed to the latter (Py_intptr_t). (This might be pedantic, but it is a formal error.)
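To illustrate the point, a hypothetical inner loop (not taken from the patch) indexed the PEP 353 way:

    #include <Python.h>

    /* Py_ssize_t is defined to be large enough to index any object,
     * whereas Py_intptr_t is only defined to round-trip a pointer value;
     * the two just happen to coincide on flat 32/64-bit address spaces. */
    static double buffer_sum(const double *data, Py_ssize_t n)
    {
        double acc = 0.0;
        for (Py_ssize_t i = 0; i < n; ++i)
            acc += data[i];
        return acc;
    }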
- using cython is good - that could be a first patch, to replace the existing manual wrapping by cython. You can declare pointer without trouble in cython.
No, you cannot create pointers to a variable declared "object". The following is illegal Cython:
    cdef object *ptr   # would be similar to PyObject **ptr in C

So if we want to filter with dtype object, one could use Dag Sverre's numpy syntax and fake "cdef object *ptr" with "cdef np.ndarray[object] ptr", but it would not be efficient; we would have to store two z-buffers and swap them for each filter step. The other option is to use "cdef PyObject **" instead of "cdef object *" in Cython, but then Cython will not do reference counting.

S.M.