Re: [pypy-dev] Questions on the pypy+numpy project

Jacob Hallén, 18.10.2011 18:41:
I'd just like to note that the compelling reason for PyPy to develop numpy support is popular demand. We did a survey last spring, in which an overwhelming number of people asked for numpy support. This indicates that there is a large group of people who will reap benefits from using PyPy plus Numpy, without specific support for scipy packages.
Depends on what the question was. Many people say "NumPy", and when you ask back, you find out that they actually meant "SciPy" or at least "NumPy and parts x, y and z of its ecosystem that I commonly use…
I was one of the people who responded to that poll, and I have to say that I fall into the category "they actually meant 'SciPy'…". I assumed that there would be an interface to numpy that would also support scipy. SciPy has a lot of packages that run various things like SVD very efficiently because it does them in C. I need access to those packages. I also write my own algorithms. For those, I want to benefit from PyPy's speed and don't necessarily want to make the algorithms fit into numpy's array-processing approach. So, I NEED SciPy, and would like to also have PyPy, and I'd like to use them together rather than having to separate everything into separate scripts, some of which use CPython/SciPy and some of which use PyPy. In fact, my current code doesn't need NumPy at all except as the way to get to SciPy. So, I have to say, I am unhappy with the current PyPy approach to NumPy. I'd rather see a much slower NumPy/PyPy integration if that meant being able to use SciPy seamlessly with PyPy. -- Gary Robinson CTO Emergent Discovery, LLC personal email: garyrob@me.com work email: grobinson@emergentdiscovery.com Company: http://www.emergentdiscovery.com Blog: http://www.garyrobinson.net

Hi Gary, On 19/10/11 12:35, Gary Robinson wrote:
So, I have to say, I am unhappy with the current PyPy approach to NumPy. I'd rather see a much slower NumPy/PyPy integration if that meant being able to use SciPy seamlessly with PyPy.
I'm not sure I'm interpreting your sentence correctly. Are you saying that you would still want a pypy+numpy+scipy, even if it ran things slower than CPython? May I ask why? ciao, Anto

On 19/10/11 13:42, Antonio Cuni wrote:
I'm not sure I'm interpreting your sentence correctly. Are you saying that you would still want a pypy+numpy+scipy, even if it ran things slower than CPython? May I ask why?
ah sorry, I think I misunderstood your email. You would like pypy+numpy+scipy so that you could write fast python-only algorithms and still use the existing libraries. I suppose this is a perfectly reasonable usecase, and indeed the current plan does not focus on this. However, I'd like to underline that to write "fast python-only algorithms", you most probably still need a fast numpy in the way it is written right now (unless you want to write your algorithms without using numpy at all). If we went to the slow-but-scipy-compatible approach, any pure python algorithm which interfaces with numpy arrays would be terribly slow. ciao, Anto

On Wed, Oct 19, 2011 at 12:57 PM, Antonio Cuni <anto.cuni@gmail.com> wrote:
On 19/10/11 13:42, Antonio Cuni wrote:
I'm not sure I'm interpreting your sentence correctly. Are you saying that you would still want a pypy+numpy+scipy, even if it ran things slower than CPython? May I ask why?
ah sorry, I think I misunderstood your email.
You would like pypy+numpy+scipy so that you could write fast python-only algorithms and still use the existing libraries. I suppose this is a perfectly reasonable usecase, and indeed the current plan does not focus on this.
I want this too - well actually pypy+numpy+xxx where xxx uses bits of the numpy C API. I don't care if the numpy bits are a *bit* slower under PyPy than C Python - 100% compatibility is more important to me.
However, I'd like to underline that to write "fast python-only algorithms", you most probably still need a fast numpy in the way it is written right now (unless you want to write your algorithms without using numpy at all). If we went to the slow-but-scipy-compatible approach, any pure python algorithm which interfaces with numpy arrays would be terribly slow.
I'd be happy with "close to numpy under C Python" speeds for my code using numpy under PyPy, with fast python-only bits. That covers quite a lot of use cases I would think, but if we'd get "terribly slow" for the numpy using bits that is less tempting. Depending on your value of terrible ;) Right now the PyPy micronumpy is far too limited to be of real use even where I'm using only the Python interface. e.g. there is no numpy.linalg module: https://bugs.pypy.org/issue915 Peter

You would like pypy+numpy+scipy so that you could write fast python-only algorithms and still use the existing libraries. I suppose this is a perfectly reasonable usecase, and indeed the current plan does not focus on this.
Yes. That is exactly what I want.
However, I'd like to underline that to write "fast python-only algorithms", you most probably still need a fast numpy in the way it is written right now (unless you want to write your algorithms without using numpy at all)
I make very little use of numpy itself other than as the way to use scipy; I tend to write python-only algorithms that don't use numpy. As Peter Cock says in his own reply, a little bit of slowdown in regular numpy use compared to CPython would be fine, though a LOT of slowdown could be a problem. Now, I'm not saying I'm typical. I have no idea how typical I am, though it sounds like Peter Cock is in a similar boat. I'm sure I'd benefit from doing more with numpy. But I simply cannot do without scipy, or accessing equivalent functionality by using R or another package. I'd much rather use scipy and see its capabilities grow than use R.
From my own bias, I'd assume that what would benefit the scientific community most is scipy integration first, and a faster numpy second. Scipy simply provides too many absolutely essential tools for me to do without it.
The project for providing a common interface to IronPython, etc. sounded extremely promising in that regard -- it makes enormous sense to me that all different versions of python should have a way to access scipy, even if custom code that uses numpy is a little bit slower. My main concern is that the glue to frequently-called scipy functions such as scipy.stats.stats.chisqprob wouldn't be so much slower that my overall script isn't benefiting from PyPy. Obviously, I understand that this is an open-source project and people develop what they are interested in. I'm just giving my individual perspective, for whatever it may be worth. -- Gary Robinson CTO Emergent Discovery, LLC personal email: garyrob@me.com work email: grobinson@emergentdiscovery.com Company: http://www.emergentdiscovery.com Blog: http://www.garyrobinson.net On Oct 19, 2011, at 7:57 AM, Antonio Cuni wrote:
On 19/10/11 13:42, Antonio Cuni wrote:
I'm not sure I'm interpreting your sentence correctly. Are you saying that you would still want a pypy+numpy+scipy, even if it ran things slower than CPython? May I ask why?
ah sorry, I think I misunderstood your email.
You would like pypy+numpy+scipy so that you could write fast python-only algorithms and still use the existing libraries. I suppose this is a perfectly reasonable usecase, and indeed the current plan does not focus on this.
However, I'd like to underline that to write "fast python-only algorithms", you most probably still need a fast numpy in the way it is written right now (unless you want to write your algorithms without using numpy at all). If we went to the slow-but-scipy-compatible approach, any pure python algorithm which interfaces with numpy arrays would be terribly slow.
ciao, Anto

Hello Gary, On 19/10/11 15:38, Gary Robinson wrote:
You would like pypy+numpy+scipy so that you could write fast python-only algorithms and still use the existing libraries. I suppose this is a perfectly reasonable usecase, and indeed the current plan does not focus on this.
Yes. That is exactly what I want. [cut]
thank you for the input: indeed, I agree that for your usecase the current plan is not the best. OTOH, there is probably someone else for whom the current plan is better than others; we cannot make everyone happy at the same time, although we might do it eventually :-). By the way, did you ever consider the possibility of running pypy and cpython side-by-side? You do your pure-python computation on pypy, then you pipe the results (e.g. by using execnet) to a cpython process which does the processing using scipy. Depending on how big the data is, the overhead of passing the data around should not be too high. It's not ideal, but it might be worth trying. ciao, Anto
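[For concreteness, a minimal sketch of the side-by-side setup Antonio describes might look like the following. It assumes execnet is installed, that a CPython interpreter with scipy is reachable as "python" on the PATH (the popen spec is an assumption), and it uses the scipy.stats.stats.chisqprob call mentioned elsewhere in the thread purely as an example.]

    # PyPy side: keep one long-lived CPython worker so the gateway start-up
    # cost is paid only once, then stream requests over the channel.
    import execnet

    gw = execnet.makegateway("popen//python=python")  # CPython with scipy (assumption)
    channel = gw.remote_exec("""
        from scipy.stats.stats import chisqprob
        while True:
            item = channel.receive()
            if item is None:          # sentinel: the PyPy side is done
                break
            chisq, df = item
            channel.send(chisqprob(chisq, df))
    """)

    for chisq in (3.84, 6.63, 10.83):
        channel.send((chisq, 1))      # only plain numbers cross the boundary
        print(channel.receive())      # p-value computed by CPython/scipy

    channel.send(None)
    gw.exit()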

By the way, did you ever consider the possibility of running pypy and cpython side-by-side? You do your pure-python computation on pypy, then you pipe the results (e.g. by using execnet) to a cpython process which does the processing using scipy. Depending on how big the data is, the overhead of passing the data around should not be too high. Absolutely -- I've thought about that general approach, though this is the first time I recall hearing about execnet. Of course I'm concerned that the overhead would be too much in some cases, such as huge numbers of calls to scipy.stats.stats.chisqprob. Such overhead seems like it might cancel all the benefit of PyPy, depending on the script. But maybe it's not as much overhead as I fear. For example, I see that execnet does not do pickling. Hm.
-- Gary Robinson CTO Emergent Discovery, LLC personal email: garyrob@me.com work email: grobinson@emergentdiscovery.com Company: http://www.emergentdiscovery.com Blog: http://www.garyrobinson.net On Oct 19, 2011, at 10:27 AM, Antonio Cuni wrote:
Hello Gary,
On 19/10/11 15:38, Gary Robinson wrote:
You would like pypy+numpy+scipy so that you could write fast python-only algorithms and still use the existing libraries. I suppose this is a perfectly reasonable usecase, and indeed the current plan does not focus on this.
Yes. That is exactly what I want. [cut]
thank you for the input: indeed, I agree that for your usecase the current plan is not the best. OTOH, there is probably someone else for whom the current plan is better than others; we cannot make everyone happy at the same time, although we might do it eventually :-).
By the way, did you ever consider the possibility of running pypy and cpython side-by-side? You do your pure-python computation on pypy, then you pipe the results (e.g. by using execnet) to a cpython process which does the processing using scipy. Depending on how big the data is, the overhead of passing the data around should not be too high.
It's not ideal, but it might be worth trying.
ciao, Anto

On Wed, Oct 19, 2011 at 10:38 -0400, Gary Robinson wrote:
By the way, did you ever consider the possibility of running pypy and cpython side-by-side? You do your pure-python computation on pypy, then you pipe the results (e.g. by using execnet) to a cpython process which does the processing using scipy. Depending on how big the data is, the overhead of passing the data around should not be too high. Absolutely -- I've thought about that general approach, though this is the first time I recall hearing about execnet. Of course I'm concerned that the overhead would be too much in some cases, such as huge numbers of calls to scipy.stats.stats.chisqprob. Such overhead seems like it might cancel all the benefit of PyPy, depending on the script. But maybe it's not as much overhead as I fear. For example, I see that execnet does not do pickling. Hm.
You can add pickling on top of execnet by using dumps/loads and sending the bytes, which is fast. I recommend using pickling with great care, and not everywhere, though. holger
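[A rough sketch of what holger suggests, pickling by hand and sending only the bytes over an execnet channel. The summarize() helper is purely a hypothetical stand-in for whatever scipy-side work would actually be done.]

    import execnet
    import pickle

    gw = execnet.makegateway("popen//python=python")   # CPython side (assumption)
    channel = gw.remote_exec("""
        import pickle
        def summarize(data):             # hypothetical placeholder for real scipy work
            return {"n": len(data), "total": sum(data)}
        payload = channel.receive()      # raw pickled bytes arrive...
        data = pickle.loads(payload)     # ...and become a Python object here
        channel.send(pickle.dumps(summarize(data)))
    """)

    # Only the bytes travel over the channel; dumps/loads happen at each end.
    channel.send(pickle.dumps([1.5, 2.5, 3.5]))
    print(pickle.loads(channel.receive()))
    gw.exit()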
--
Gary Robinson CTO Emergent Discovery, LLC personal email: garyrob@me.com work email: grobinson@emergentdiscovery.com Company: http://www.emergentdiscovery.com Blog: http://www.garyrobinson.net
On Oct 19, 2011, at 10:27 AM, Antonio Cuni wrote:
Hello Gary,
On 19/10/11 15:38, Gary Robinson wrote:
You would like pypy+numpy+scipy so that you could write fast python-only algorithms and still use the existing libraries. I suppose this is a perfectly reasonable usecase, and indeed the current plan does not focus on this.
Yes. That is exactly what I want. [cut]
thank you for the input: indeed, I agree that for your usecase the current plan is not the best. OTOH, there is probably someone else for whom the current plan is better than others; we cannot make everyone happy at the same time, although we might do it eventually :-).
By the way, did you ever consider the possibility of running pypy and cpython side-by-side? You do your pure-python computation on pypy, then you pipe the results (e.g. by using execnet) to a cpython process which does the processing using scipy. Depending on how big the data is, the overhead of passing the data around should not be too high.
It's not ideal, but it might be worth trying.
ciao, Anto

I wonder if it would be worthwhile to have another poll, this time clearly differentiating between a) focusing on integrating the existing numpy in such a way that scipy and other such packages are also enabled, probably using the existing project to provide a C interface that IronPython and other Python variants can use; or b) the current path of replacing much of numpy, making it much faster but leaving scipy out in the cold for quite some time. I don't think it's clear, at this point, which approach would generate more monetary contributions. I suspect it might be (a) because of commercial scientific research that depends on scipy. Of course, if the path decision is already firm, then such a poll would be moot. -- Gary Robinson CTO Emergent Discovery, LLC personal email: garyrob@me.com work email: grobinson@emergentdiscovery.com Company: http://www.emergentdiscovery.com Blog: http://www.garyrobinson.net On Oct 19, 2011, at 10:27 AM, Antonio Cuni wrote:
Hello Gary,
On 19/10/11 15:38, Gary Robinson wrote:
You would like pypy+numpy+scipy so that you could write fast python-only algorithms and still use the existing libraries. I suppose this is a perfectly reasonable usecase, and indeed the current plan does not focus on this.
Yes. That is exactly what I want. [cut]
thank you for the input: indeed, I agree that for your usecase the current plan is not the best. OTOH, there is probably someone else for whom the current plan is better than others; we cannot make everyone happy at the same time, although we might do it eventually :-).
By the way, did you ever consider the possibility of running pypy and cpython side-by-side? You do your pure-python computation on pypy, then you pipe the results (e.g. by using execnet) to a cpython process which does the processing using scipy. Depending on how big the data is, the overhead of passing the data around should not be too high.
It's not ideal, but it might be worth trying.
ciao, Anto

On Wed, Oct 19, 2011 at 4:49 PM, Gary Robinson <garyrob@me.com> wrote:
I wonder if it would be worthwhile to have another poll, this time clearly differentiating between
a) focusing on integrating the existing numpy in such a way that scipy and other such packages are also enabled, probably using the existing project to provide a C interface that IronPython and other Python variants can use; or
b) the current path of replacing much of numpy, making it much faster but leaving scipy out in the cold for quite some time.
I don't think it's clear, at this point, which approach would generate more monetary contributions. I suspect it might be (a) because of commercial scientific research that depends on scipy. Of course, if the path decision is already firm, then such a poll would be moot.
It's however clear which approach is harder and more painful. I for one am not signing up to emulate all the subtleties of the CPython C API, nor of the numpy API.

On Wed, Oct 19, 2011 at 6:25 PM, Maciej Fijalkowski <fijall@gmail.com> wrote:
On Wed, Oct 19, 2011 at 4:49 PM, Gary Robinson <garyrob@me.com> wrote:
I wonder if it would be worthwhile to have another poll, this time clearly differentiating between
a) focusing on integrating the existing numpy in such a way that scipy and other such packages are also enabled, probably using the existing project to provide a C interface that IronPython and other Python variants can use; or
b) the current path of replacing much of numpy, making it much faster but leaving scipy out in the cold for quite some time.
I don't think it's clear, at this point, which approach would generate more monetary contributions. I suspect it might be (a) because of commercial scientific research that depends on scipy. Of course, if the path decision is already firm, then such a poll would be moot.
It's however clear which approach is harder and more painful. I for one am not signing up to emulate all the subtleties of the CPython C API, nor of the numpy API.
If you do that, you are not porting numpy, and the current code-name of micronumpy is quite appropriate ;) I would be much more interested in (a), since as I understand it (b) would only cater to libraries using just numpy's python interface (and even there PyPy still has a lot of work to do). Peter

On Wed, Oct 19, 2011 at 3:36 PM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Wed, Oct 19, 2011 at 6:25 PM, Maciej Fijalkowski <fijall@gmail.com> wrote:
On Wed, Oct 19, 2011 at 4:49 PM, Gary Robinson <garyrob@me.com> wrote:
I wonder if it would be worthwhile to have another poll, this time clearly differentiating between
a) focusing on integrating the existing numpy in such a way that scipy and other such packages are also enabled, probably using the existing project to provide a C interface that IronPython and other Python variants can use; or
b) the current path of replacing much of numpy, making it much faster but leaving scipy out in the cold for quite some time.
I don't think it's clear, at this point, which approach would generate more monetary contributions. I suspect it might be (a) because of commercial scientific research that depends on scipy. Of course, if the path decision is already firm, then such a poll would be moot.
It's however clear which approach is harder and more painful. I for one am not signing up to emulate all the subtleties of the CPython C API, nor of the numpy API.
If you do that, you are not porting numpy, and the current code-name of micronumpy is quite appropriate ;)
I would be much more interested in (a), since as I understand it (b) would only cater to libraries using just numpy's python interface (and even there PyPy still has a lot of work to do).
Is any pypy dev interested in (a)? I really don't think there is one, so we might as well keep discussing the road to take. For games and for a lot of matrix processing (AI, encryption) numpy is good enough. Matplotlib could be easily ported to pypy, just by dropping some features (AGG and part of the freetype support code). FFT, BLAS and a lot of pieces of scipy just need a pointer to data, so they should work (if there is a ctypes wrapper). Cython is being worked on, so when that is ready parts of Sage will just run, as will everything else using Cython. Why not move more of scipy to cython/ctypes? That is what you guys want for the future, and then nobody would have to work on something they have no interest in. -- Leonardo Santagada
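[To illustrate the "just need a pointer to data" point, a ctypes call into a BLAS routine might look roughly like the sketch below. The library name and the presence of the cblas_ddot symbol depend entirely on which BLAS build is installed, so treat both as assumptions.]

    import ctypes
    from array import array

    # Library name is an assumption; it varies by platform and BLAS build
    # (e.g. "libopenblas.so", "libcblas.so", "libblas.dylib", ...).
    blas = ctypes.CDLL("libopenblas.so")

    blas.cblas_ddot.restype = ctypes.c_double
    blas.cblas_ddot.argtypes = [
        ctypes.c_int,
        ctypes.POINTER(ctypes.c_double), ctypes.c_int,
        ctypes.POINTER(ctypes.c_double), ctypes.c_int,
    ]

    def as_double_ptr(a):
        # buffer_info() exposes the raw address of the array's data -- exactly
        # the "pointer to data" that a BLAS-style C library needs.
        address, _length = a.buffer_info()
        return ctypes.cast(address, ctypes.POINTER(ctypes.c_double))

    x = array('d', [1.0, 2.0, 3.0])
    y = array('d', [4.0, 5.0, 6.0])

    # Dot product computed entirely inside the BLAS library: prints 32.0
    print(blas.cblas_ddot(len(x), as_double_ptr(x), 1, as_double_ptr(y), 1))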

On Wed, Oct 19, 2011 at 6:47 PM, Leonardo Santagada <santagada@gmail.com> wrote:
Why not move more of scipy to cython/ctypes? That is what you guys want for the future, and then it would not make anyone have to work on something they have no interest in.
Independently of pypy's direction w.r.t. numpy, this will happen. My ideal for numpy/scipy would be pure C/Fortran for the existing libraries, with almost every Python C API call done through Cython. This is already happening for most packages in the scipy ecosystem that do not have the long history of numpy/scipy. cheers, David

On Wed, Oct 19, 2011 at 6:35 AM, Gary Robinson <garyrob@me.com> wrote:
Jacob Hallén, 18.10.2011 18:41:
I'd just like to note that the compelling reason for PyPy to develop numpy support is popular demand. We did a survey last spring, in which an overwhelming number of people asked for numpy support. This indicates that there is a large group of people who will reap benefits from using PyPy plus Numpy, without specific support for scipy packages.
Depends on what the question was. Many people say "NumPy", and when you ask back, you find out that they actually meant "SciPy" or at least "NumPy and parts x, y and z of its ecosystem that I commonly use…
I was one of the people who responded to that poll, and I have to say that I fall into the category "they actually meant 'SciPy'…". I assumed that there would be an interface to numpy that would also support scipy. SciPy has a lot of packages that run various things like SVD very efficiently because it does them in C. I need access to those packages. I also write my own algorithms. For those, I want to benefit from PyPy's speed and don't necessarily want to make the algorithms fit into numpy's array-processing approach.
I suspect most of the poll respondents see NumPy as representing a lot more than just a fast ndarray -- i.e. that being able to type import numpy is the key to being able to tap the whole scientific Python ecosystem. I cannot be certain, but I am willing to bet that the percentage of people who use NumPy and NumPy only, relative to the rest of the scientific Python community, is very small.
So, I NEED SciPy, and would like to also have PyPy, and I'd like to use them together rather than having to separate everything into separate scripts, some of which use CPython/SciPy and some of which use PyPy. In fact, my current code doesn't need NumPy at all except as the way to get to SciPy.
So, I have to say, I am unhappy with the current PyPy approach to NumPy. I'd rather see a much slower NumPy/PyPy integration if that meant being able to use SciPy seamlessly with PyPy.
--
Gary Robinson CTO Emergent Discovery, LLC personal email: garyrob@me.com work email: grobinson@emergentdiscovery.com Company: http://www.emergentdiscovery.com Blog: http://www.garyrobinson.net

On 10/19/2011 8:32 AM, Wes McKinney wrote:
On Wed, Oct 19, 2011 at 6:35 AM, Gary Robinson<garyrob@me.com> wrote:
Jacob Hallén, 18.10.2011 18:41:
I'd just like to note that the compelling reason for PyPy to develop numpy support is popular demand. We did a survey last spring, in which an overwhelming number of people asked for numpy support. This indicates that there is a large group of people who will reap benefits from using PyPy plus Numpy, without specific support for scipy packages.
Depends on what the question was. Many people say "NumPy", and when you ask back, you find out that they actually meant "SciPy" or at least "NumPy and parts x, y and z of its ecosystem that I commonly use…
I was one of the people who responded to that poll, and I have to say that I fall into the category "they actually meant 'SciPy'…". I assumed that there would be an interface to numpy that would also support scipy. SciPy has a lot of packages that run various things like SVD very efficiently because it does them in C. I need access to those packages. I also write my own algorithms. For those, I want to benefit from PyPy's speed and don't necessarily want to make the algorithms fit into numpy's array-processing approach.
I suspect most of the poll respondents see NumPy as representing a lot more than just a fast ndarray-- i.e. that being able to type
import numpy
Or a misunderstanding of how much the other packages rely on low-level functions. There might have been a perception that a good chunk of the supporting libraries required only the numpy API (the Python-side API). My guess is that there have to be some packages that only require numpy on the Python side and none of the foreign interfaces. I am surprised that MPL is mentioned as requiring the low-level interfaces and not simply the numpy arrays API. If the pypy team is successful with numpypy, I would guess (naively?) that some packages would work with minimal modification. My preference would be to have full support (numpy/scipy/matplotlib), but because of the success of pypy on MyHDL I would be happy (extremely happy) with a fast-running numpy without scipy. Regards Chris

I was one of the people who responded to that poll, and I have to say that I fall into the category "they actually meant 'SciPy'…".
I'll note with regards to the survey that I also recall saying Yes to numpy but never thinking to explain that I used SciPy, the SciKits and Cython for a lot of my work (not all of it but definitely for chunks of it). Maybe a second more focused survey would be useful? Ian. -- Ian Ozsvald (A.I. researcher) ian@IanOzsvald.com http://IanOzsvald.com http://MorConsulting.com/ http://StrongSteam.com/ http://SocialTiesApp.com/ http://TheScreencastingHandbook.com http://FivePoundApp.com/ http://twitter.com/IanOzsvald

On Thu, Oct 20, 2011 at 12:33 PM, Ian Ozsvald <ian@ianozsvald.com> wrote:
I was one of the people who responded to that poll, and I have to say that I fall into the category "they actually meant 'SciPy'…".
I'll note with regards to the survey that I also recall saying Yes to numpy but never thinking to explain that I used SciPy, the SciKits and Cython for a lot of my work (not all of it but definitely for chunks of it). Maybe a second more focused survey would be useful?
I think Armin made it clear enough but apparently not. We're not against scipy and we will try our best at supporting it. However it's not in the first part of the proposal - let's be reasonable, pypy is not magic, we can't make everything happen at the same time. We believe that emulating the CPython C API is a lot of pain and numpy does not adhere to it anyway. We also see how cython is not the central part of numpy right now and it's unclear whether cython bindings would ever be done as the basis of the numpy array. How would you do that anyway? So providing a basic, working and preferably fast array type is an absolute necessity to go forward. We don't want to plan upfront what then. We also think providing the array type *has* to break backwards compatibility or it'll be a major pain to implement, simply because CPython is too different. And, as a value added, fast operations on low-level data *in python* while not a priority for a lot of scipy people is a priority for a lot of pypy people - it's just very useful. If you have a plan how to go forward *and* immediately get scipy, please speak up, I don't. Cheers, fijal
Ian.
-- Ian Ozsvald (A.I. researcher) ian@IanOzsvald.com
http://IanOzsvald.com http://MorConsulting.com/ http://StrongSteam.com/ http://SocialTiesApp.com/ http://TheScreencastingHandbook.com http://FivePoundApp.com/ http://twitter.com/IanOzsvald

I agree that numpy support is a good first aim, I hope it'll open the door to scipy support later. To that end I've made my donation. As discussed with Fijal via a private email I felt awkward with the new project (hence me asking the question 60 emails back) as I'd offered a £600 donation which was made on the assumption that numpy+scipy support would be possible (and to be clear - this was entirely *my* assumption, made at EuroPython, before the project was defined - the error was mine). Obviously I want to see numpy supported, I do also want to see scipy (and probably cython) supported too. So, I've just donated $480USD (£300) for the numpy-pypy project as a personal donation. I'll make a second donation of $480 as and when a project is proposed that enables scipy support. This fits with my goals and I hope it helps the project move forwards. Cheers all, Ian. On 20 October 2011 11:41, Maciej Fijalkowski <fijall@gmail.com> wrote:
On Thu, Oct 20, 2011 at 12:33 PM, Ian Ozsvald <ian@ianozsvald.com> wrote:
I was one of the people who responded to that poll, and I have to say that I fall into the category "they actually meant 'SciPy'…".
I'll note with regards to the survey that I also recall saying Yes to numpy but never thinking to explain that I used SciPy, the SciKits and Cython for a lot of my work (not all of it but definitely for chunks of it). Maybe a second more focused survey would be useful?
I think Armin made it clear enough but apparently not. We're not against scipy and we will try our best at supporting it. However it's not in the first part of the proposal - let's be reasonable, pypy is not magic, we can't make everything happen at the same time.
We believe that emulating the CPython C API is a lot of pain and numpy does not adhere to it anyway. We also see how cython is not the central part of numpy right now and it's unclear whether cython bindings would ever be done as the basis of the numpy array. How would you do that anyway?
So providing a basic, working and preferably fast array type is an absolute necessity to go forward. We don't want to plan upfront what then. We also think providing the array type *has* to break backwards compatibility or it'll be a major pain to implement, simply because CPython is too different. And, as a value added, fast operations on low-level data *in python* while not a priority for a lot of scipy people is a priority for a lot of pypy people - it's just very useful.
If you have a plan how to go forward *and* immediately get scipy, please speak up, I don't.
Cheers, fijal
Ian.
-- Ian Ozsvald (A.I. researcher) ian@IanOzsvald.com
http://IanOzsvald.com http://MorConsulting.com/ http://StrongSteam.com/ http://SocialTiesApp.com/ http://TheScreencastingHandbook.com http://FivePoundApp.com/ http://twitter.com/IanOzsvald
-- Ian Ozsvald (A.I. researcher) ian@IanOzsvald.com http://IanOzsvald.com http://MorConsulting.com/ http://StrongSteam.com/ http://SocialTiesApp.com/ http://TheScreencastingHandbook.com http://FivePoundApp.com/ http://twitter.com/IanOzsvald

Donation is in here too... numpy is just the beginning step in a great direction. Go pypy! On Sun, Oct 23, 2011 at 12:24 PM, Ian Ozsvald <ian@ianozsvald.com> wrote:
I agree that numpy support is a good first aim, I hope it'll open the door to scipy support later.
To that end I've made my donation. As discussed with Fijal via a private email I felt awkward with the new project (hence me asking the question 60 emails back) as I'd offered a £600 donation which was made on the assumption that numpy+scipy support would be possible (and to be clear - this was entirely *my* assumption, made at EuroPython, before the project was defined - the error was mine). Obviously I want to see numpy supported, I do also want to see scipy (and probably cython) supported too.
So, I've just donated $480USD (£300) for the numpy-pypy project as a personal donation. I'll make a second donation of $480 as and when a project is proposed that enables scipy support. This fits with my goals and I hope it helps the project move forwards.
Cheers all, Ian.
On 20 October 2011 11:41, Maciej Fijalkowski <fijall@gmail.com> wrote:
On Thu, Oct 20, 2011 at 12:33 PM, Ian Ozsvald <ian@ianozsvald.com> wrote:
I was one of the people who responded to that poll, and I have to say that I fall into the category "they actually meant 'SciPy'…".
I'll note with regards to the survey that I also recall saying Yes to numpy but never thinking to explain that I used SciPy, the SciKits and Cython for a lot of my work (not all of it but definitely for chunks of it). Maybe a second more focused survey would be useful?
I think Armin made it clear enough but apparently not. We're not against scipy and we will try our best at supporting it. However it's not in the first part of the proposal - let's be reasonable, pypy is not magic, we can't make everything happen at the same time.
We believe that emulating the CPython C API is a lot of pain and numpy does not adhere to it anyway. We also see how cython is not the central part of numpy right now and it's unclear whether cython bindings would ever be done as the basis of the numpy array. How would you do that anyway?
So providing a basic, working and preferably fast array type is an absolute necessity to go forward. We don't want to plan upfront what then. We also think providing the array type *has* to break backwards compatibility or it'll be a major pain to implement, simply because CPython is too different. And, as a value added, fast operations on low-level data *in python* while not a priority for a lot of scipy people is a priority for a lot of pypy people - it's just very useful.
If you have a plan how to go forward *and* immediately get scipy, please speak up, I don't.
Cheers, fijal
Ian.
-- Ian Ozsvald (A.I. researcher) ian@IanOzsvald.com
http://IanOzsvald.com http://MorConsulting.com/ http://StrongSteam.com/ http://SocialTiesApp.com/ http://TheScreencastingHandbook.com http://FivePoundApp.com/ http://twitter.com/IanOzsvald
-- Ian Ozsvald (A.I. researcher) ian@IanOzsvald.com
http://IanOzsvald.com http://MorConsulting.com/ http://StrongSteam.com/ http://SocialTiesApp.com/ http://TheScreencastingHandbook.com http://FivePoundApp.com/ http://twitter.com/IanOzsvald

On Sun, Oct 23, 2011 at 6:24 PM, Ian Ozsvald <ian@ianozsvald.com> wrote:
I agree that numpy support is a good first aim, I hope it'll open the door to scipy support later.
To that end I've made my donation. As discussed with Fijal via a private email I felt awkward with the new project (hence me asking the question 60 emails back) as I'd offered a £600 donation which was made on the assumption that numpy+scipy support would be possible (and to be clear - this was entirely *my* assumption, made at EuroPython, before the project was defined - the error was mine). Obviously I want to see numpy supported, I do also want to see scipy (and probably cython) supported too.
So, I've just donated $480USD (£300) for the numpy-pypy project as a personal donation. I'll make a second donation of $480 as and when a project is proposed that enables scipy support. This fits with my goals and I hope it helps the project move forwards.
Cheers all, Ian.
Thanks Ian for the donation! It makes perfect sense.
On 20 October 2011 11:41, Maciej Fijalkowski <fijall@gmail.com> wrote:
On Thu, Oct 20, 2011 at 12:33 PM, Ian Ozsvald <ian@ianozsvald.com> wrote:
I was one of the people who responded to that poll, and I have to say that I fall into the category "they actually meant 'SciPy'…".
I'll note with regards to the survey that I also recall saying Yes to numpy but never thinking to explain that I used SciPy, the SciKits and Cython for a lot of my work (not all of it but definitely for chunks of it). Maybe a second more focused survey would be useful?
I think Armin made it clear enough but apparently not. We're not against scipy and we will try our best at supporting it. However it's not in the first part of the proposal - let's be reasonable, pypy is not magic, we can't make everything happen at the same time.
We believe that emulating the CPython C API is a lot of pain and numpy does not adhere to it anyway. We also see how cython is not the central part of numpy right now and it's unclear whether cython bindings would ever be done as the basis of the numpy array. How would you do that anyway?
So providing a basic, working and preferably fast array type is an absolute necessity to go forward. We don't want to plan upfront what then. We also think providing the array type *has* to break backwards compatibility or it'll be a major pain to implement, simply because CPython is too different. And, as a value added, fast operations on low-level data *in python* while not a priority for a lot of scipy people is a priority for a lot of pypy people - it's just very useful.
If you have a plan how to go forward *and* immediately get scipy, please speak up, I don't.
Cheers, fijal
Ian.
-- Ian Ozsvald (A.I. researcher) ian@IanOzsvald.com
http://IanOzsvald.com http://MorConsulting.com/ http://StrongSteam.com/ http://SocialTiesApp.com/ http://TheScreencastingHandbook.com http://FivePoundApp.com/ http://twitter.com/IanOzsvald
-- Ian Ozsvald (A.I. researcher) ian@IanOzsvald.com
http://IanOzsvald.com http://MorConsulting.com/ http://StrongSteam.com/ http://SocialTiesApp.com/ http://TheScreencastingHandbook.com http://FivePoundApp.com/ http://twitter.com/IanOzsvald
participants (11)
- Antonio Cuni
- Chris Wj
- Christopher Felton
- David Cournapeau
- Gary Robinson
- holger krekel
- Ian Ozsvald
- Leonardo Santagada
- Maciej Fijalkowski
- Peter Cock
- Wes McKinney