[Numpy-discussion] More complex data types

Neal Becker ndbecker2 at gmail.com
Fri Oct 5 11:48:39 EDT 2007


Charles R Harris wrote:

> On 10/5/07, Neal Becker <ndbecker2 at gmail.com> wrote:
>>
>> I'm thinking (again) about using numpy for signal processing
>> applications.  One issue is that there are data types commonly used in
>> signal processing that are not available in numpy (or python).
>> Specifically, it is frequently necessary to convert floating-point
>> algorithms into integer algorithms.  numpy is fine for arrays of
>> integers (of various sizes), but it is also very useful to have arrays
>> of complex integers.  While numpy has complex<double> and
>> complex<float>, it doesn't have complex<int>, complex<int64>, etc.
>> Has anyone thought about this?
> 
> 
> A bit. Multiplication begins to be a problem, though. Would you also want
> fixed-point multiplication with scaling, a la PPC with AltiVec? What about
> division? And so on. I think something like this would be best implemented
> in a specialized signal processing package, but I am not sure of the best
> way to do it.
> 

I'd keep things as simple as possible: no fixed point/scaling.  It's easy
enough to explicitly rescale things as you wish.

That is (using C++ syntax):
complex<int> a, b;
complex<int> c = a * b;
complex<int> d = c >> 4;   // rescale explicitly after the multiply
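
A minimal numpy sketch of the same idea, assuming the real and imaginary
parts are carried as separate integer arrays (there is no complex-integer
dtype today, and all names below are purely illustrative):

import numpy as np

# real/imag parts of a and b as plain int32 arrays
ar = np.array([3, 5], dtype=np.int32)
ai = np.array([1, -2], dtype=np.int32)
br = np.array([2, 7], dtype=np.int32)
bi = np.array([4, 6], dtype=np.int32)

# complex multiply written out on the integer parts
cr = ar * br - ai * bi
ci = ar * bi + ai * br

# explicit rescale, analogous to the >> 4 above
dr = cr >> 4
di = ci >> 4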

What complicates life is interoperability (conversion) between types.

I've used this approach for some years with C++/Python, but not with numpy.
It's pretty trivial to make a complex<int> type as a C extension to Python.
Adding this to numpy would be really useful.
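
As a stopgap for storage and conversion (not arithmetic), a record dtype can
already hold complex<int> data and map cleanly onto a C struct; this is only
a sketch of what numpy offers today, not the complex<int> dtype being asked
for:

import numpy as np

# a "complex int32" record type: two named int32 fields
cint32 = np.dtype([('re', np.int32), ('im', np.int32)])

z = np.zeros(4, dtype=cint32)
z['re'] = [1, 2, 3, 4]
z['im'] = [5, 6, 7, 8]

# arithmetic still has to be written by hand on the fields
w_re = z['re'] * 2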



