2009/12/9 Dr. Phillip M. Feldman
Pauli Virtanen wrote:
Nevertheless, I can't really regard dropping the imaginary part as a significant issue.
I am amazed that anyone could say this. For anyone who works with Fourier transforms, or with electrical circuits, or with electromagnetic waves, dropping the imaginary part is a huge issue because we get answers that are totally wrong.
I agree that dropping the imaginary part is a wart. But it is one that is not very hard to learn to live with. I say this as someone who has been burned by it while using Fourier analysis to work with astronomical data.
When I recently tried to validate a code, the answers were wrong, and it took two full days to track down the cause. I am now forced to reconsider carefully whether Python/NumPy is a suitable platform for serious scientific computing.
While I find the current numpy complex->real conversion annoying, I have to say, this kind of rhetoric does not benefit your cause. It sounds childish and manipulative, and makes even people who agree in principle want to tell you to go ahead and use MATLAB and stop pestering us.

We are not here to sell you on numpy; if you hate it, don't use it. We are here because *we* use it, warts and all, and we want to discuss interesting topics related to numpy. That you would have implemented it differently is not very interesting if you are not even willing to understand why it is the way it is and what a change would cost, let alone to propose a workable way to improve it.

Anne