[Numpy-discussion] Assigning complex value to real array

Andrew P. Mullhaupt doc at zen-pharaohs.com
Thu Oct 7 12:08:09 EDT 2010


  On 10/7/2010 1:31 AM, Charles R Harris wrote:
>
>
> On Wed, Oct 6, 2010 at 11:07 PM, Andrew P. Mullhaupt 
> <doc at zen-pharaohs.com> wrote:
>
>
>     I came across this gem yesterday
>
>
>           >>> from numpy import *
>           >>> R = ones((2))
>           >>> R[0] = R[0] * 1j
>           >>> R
>           array([ 0.,  1.])
>           >>> R = ones((2), 'complex')
>           >>> R[0] = R[0] * 1j
>           >>> R
>           array([ 0.+1.j, 1.+0.j])"
>
>     and I read that this behavior is actually intended, for some
>     reason to do with how Python wants the relations between types to
>     be, such that this mistake is unavoidable.
>
>
> It's because an element of a real array only has space for a real and 
> you can't fit a complex in there. Some other software which is less 
> strict about types may allow such things, but it comes at a cost.

A cost that, in any particular instance, drops by something like a
factor of two every year.

But the cost of causing someone to chase down a bug hiding in this 
particular bush is a human cost, so, if anything, over time that cost 
increases with the rate of inflation.

>
>     So can we have a new abstract floating type which is a complex,
>     but is implemented so that numbers whose imaginary part is zero
>     represent and operate on that imaginary part implicitly? By
>     containing all these semantics within one type, we presumably
>     avoid problems with ideas about relationships between types.
>
>
> Changing types like you propose would require making a new copy or 
> reserving space ahead of time
>

No. You can define the arrays as backed by mapped files, with the real
and imaginary parts stored separately. The imaginary part, being
initially zero, is then a sparse region of the file and takes only a
fraction of the space (and, on a decent machine, doesn't incur memory
bandwidth costs either). The cost of testing whether the imaginary part
of a page has subsequently been assigned back to zero (so that the
representation of that page can be re-sparsified) can be slipstreamed
into any operation that already examines all the values on that page.
Consistency would be provided by the OS, so there wouldn't really be
much numpy-specific code involved.
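
Here is a minimal sketch of the split real/imaginary backing in Python,
assuming a filesystem with sparse-file support. The class name
SplitComplexArray and the file names are hypothetical, not an existing
numpy facility; it just illustrates that the imaginary half costs
nothing until a nonzero imaginary value actually arrives:

    import numpy as np

    class SplitComplexArray:
        """Complex-valued array backed by two memory-mapped files."""

        def __init__(self, real_path, imag_path, shape, dtype=np.float64):
            # np.memmap creates the files; on most filesystems the pages
            # of the imaginary file stay as sparse "holes" until written.
            self.re = np.memmap(real_path, dtype=dtype, mode='w+', shape=shape)
            self.im = np.memmap(imag_path, dtype=dtype, mode='w+', shape=shape)

        def __setitem__(self, idx, value):
            v = np.asarray(value)
            self.re[idx] = v.real
            imag = v.imag if np.iscomplexobj(v) else 0.0
            # Touch (and materialize) the imaginary pages only when a
            # nonzero imaginary part arrives, or when already-materialized
            # values must be overwritten with zeros to stay consistent.
            if np.any(imag != 0) or np.any(self.im[idx] != 0):
                self.im[idx] = imag

        def __getitem__(self, idx):
            # Combine the two halves on demand; a page whose imaginary
            # part is still a hole reads back as zeros at no extra cost.
            return self.re[idx] + 1j * self.im[idx]

    R = SplitComplexArray('real.dat', 'imag.dat', shape=(2,))
    R[0] = 1.0
    R[1] = 1.0
    R[0] = R[0] * 1j        # the imaginary part is kept, not dropped
    print(R[0], R[1])       # prints something like: 1j (1+0j)

The re-sparsification of pages whose imaginary part has returned to all
zeros would sit underneath this, piggybacked on whatever operation next
scans the page, with the OS handling consistency.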

So there is at least one efficient way to implement my suggestion.

Best regards,
Andrew Mullhaupt