[pypy-dev] Questions on the pypy+numpy project
Gary Robinson
garyrob at me.com
Wed Oct 19 15:38:22 CEST 2011
>> You would like pypy+numpy+scipy so that you could write fast
>> python-only algorithms and still use the existing libraries. I
>> suppose this is a perfectly reasonable use case, and indeed
>> the current plan does not focus on this.
>
Yes. That is exactly what I want.
> However, I'd like to underline that to write "fast python-only algorithms", you most probably still need a fast numpy in the way it is written right now (unless you want to write your algorithms without using numpy at all)
I make very little use of numpy itself other than as the way to use scipy; I tend to write python-only algorithms that don't use numpy. As Peter Cock says in his own reply, a little bit of slowdown in regular numpy use compared to CPython would be fine, though a LOT of slowdown could be a problem.
Now, I'm not saying I'm typical. I have no idea how typical I am, though it sounds like Peter Cock is in a similar boat. I'm sure I'd benefit from doing more with numpy. But I simply cannot do without scipy, or without equivalent functionality from R or another package. I'd much rather use scipy and see its capabilities grow than use R.
From my own bias, I'd assume that what would benefit the scientific community most is scipy integration first, and a faster numpy second. Scipy simply provides too many tools that are absolutely essential.
The project for providing a common interface to IronPython, etc. sounded extremely promising in that regard -- it makes enormous sense to me that all the different versions of python should have a way to access scipy, even if custom code that uses numpy is a little bit slower. My main concern is that the glue to frequently-called scipy functions such as scipy.stats.stats.chisqprob would be so much slower that my overall script stops benefiting from PyPy.
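To make the pattern concrete, here is a minimal sketch (not from the thread) of the kind of script being described: a pure-Python inner loop that PyPy's JIT can speed up, followed by a single call into statistical-library code. The data and function names are illustrative; in real code the last step would be a call to scipy.stats.stats.chisqprob(stat, df), so a pure-Python stand-in is used here, valid only for the df=2 special case where the chi-squared survival function reduces to exp(-x/2):

```python
import math

def chisq_statistic(observed, expected):
    """Pure-Python inner loop: the kind of code a JIT can make fast."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def chisqprob_df2(chisq):
    """Stand-in for scipy.stats.stats.chisqprob(chisq, 2).
    For df=2 the chi-squared survival function is exp(-chisq/2);
    for general df one would call into scipy."""
    return math.exp(-chisq / 2.0)

# Illustrative data: 3 categories, so df = 3 - 1 = 2.
observed = [18, 22, 35]
expected = [25, 25, 25]
stat = chisq_statistic(observed, expected)   # 6.32
p = chisqprob_df2(stat)
```

The point of the shape: almost all the time goes into the pure-Python statistic, and only the final p-value lookup crosses into scipy, so even a slow bridge to scipy would leave the script as a whole faster under PyPy.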
Obviously, I understand that this is an open-source project and people develop what they are interested in. I'm just giving my individual perspective, for whatever it may be worth.
--
Gary Robinson
CTO
Emergent Discovery, LLC
personal email: garyrob at me.com
work email: grobinson at emergentdiscovery.com
Company: http://www.emergentdiscovery.com
Blog: http://www.garyrobinson.net
On Oct 19, 2011, at 7:57 AM, Antonio Cuni wrote:
> On 19/10/11 13:42, Antonio Cuni wrote:
>
>> I'm not sure I'm interpreting your sentence correctly.
>> Are you saying that you would still want a pypy+numpy+scipy, even if it ran
>> things slower than CPython? May I ask why?
>
> ah sorry, I think I misunderstood your email.
>
> You would like pypy+numpy+scipy so that you could write fast python-only algorithms and still use the existing libraries. I suppose this is a perfectly reasonable use case, and indeed the current plan does not focus on this.
>
> However, I'd like to underline that to write "fast python-only algorithms", you most probably still need a fast numpy in the way it is written right now (unless you want to write your algorithms without using numpy at all). If we went with the slow-but-scipy-compatible approach, any pure python algorithm which interfaces with numpy arrays would be terribly slow.
>
> ciao,
> Anto
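Antonio's concern can be sketched with a small example (not from the thread, and deliberately written over plain lists to stay self-contained). Under a slow-but-compatible numpy bridge, every single element access in a loop like this would cross the Python/C boundary and box a fresh float object, costs the JIT cannot remove; with a JIT-native numpy (or plain lists), the same loop compiles to a tight numeric loop:

```python
def pure_python_dot(a, b):
    """Element-wise loop of the kind Antonio describes.  Run over real
    numpy arrays through a compatibility layer, each a[i] and b[i]
    would be a boundary crossing plus a boxed float; that per-element
    overhead is what would make this 'terribly slow'."""
    total = 0.0
    for i in range(len(a)):
        total += a[i] * b[i]
    return total

result = pure_python_dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0])  # 32.0
```

This is why a scipy-compatible-but-slow numpy would defeat the "fast python-only algorithms" goal whenever those algorithms read or write numpy arrays element by element.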