Hi All, Just thought it was time to start discussing a release schedule for numpy 2.0 so we have something to aim at. I'm thinking sometime in the period April-June might be appropriate. There is a lot coming with the next release: the Enthought numpy refactoring, Mark's float16 and iterator work, and support for IronPython. How do things look to the folks involved in those projects? Chuck
On Jan 25, 2011, at 10:42 AM, Charles R Harris wrote:
Hi All,
Just thought it was time to start discussing a release schedule for numpy 2.0 so we have something to aim at. I'm thinking sometime in the period April-June might be appropriate. There is a lot coming with the next release: the Enthought's numpy refactoring, Mark's float16 and iterator work, and support for IronPython. How do things look to the folks involved in those projects?
I would target June / July at this point ;-) I know I deserve an "I told you so" from Chuck --- I will take it. There is a bit of work that Mark is doing that would be good to include, also some modifications to the re-factoring that will support better small array performance. It may make sense for a NumPy 1.6 to come out in March / April in the interim. Thoughts? -Travis
Chuck
--- Travis Oliphant Enthought, Inc. oliphant@enthought.com 1-512-536-1057 http://www.enthought.com
On Tue, Jan 25, 2011 at 1:13 PM, Travis Oliphant <oliphant@enthought.com>wrote:
On Jan 25, 2011, at 10:42 AM, Charles R Harris wrote:
Hi All,
Just thought it was time to start discussing a release schedule for numpy 2.0 so we have something to aim at. I'm thinking sometime in the period April-June might be appropriate. There is a lot coming with the next release: the Enthought's numpy refactoring, Mark's float16 and iterator work, and support for IronPython. How do things look to the folks involved in those projects?
I would target June / July at this point ;-) I know I deserve a "I told you so" from Chuck --- I will take it.
How much remains to get done?
There is a bit of work that Mark is doing that would be good to include, also some modifications to the re-factoring that will support better small array performance.
Not everything needs to go into the first release as long as the following releases are backward compatible. So the ABI needs its final form as soon as possible. Is it still in flux?
It may make sense for a NumPy 1.6 to come out in March / April in the interim.
Pulling out the changes to attain backward compatibility isn't getting any easier. I'd rather shoot for 2.0 in June. What can the rest of us do to help move things along? Chuck
On Tue, Jan 25, 2011 at 5:18 PM, Charles R Harris <charlesr.harris@gmail.com> wrote:
On Tue, Jan 25, 2011 at 1:13 PM, Travis Oliphant <oliphant@enthought.com>wrote:
On Jan 25, 2011, at 10:42 AM, Charles R Harris wrote:
Hi All,
Just thought it was time to start discussing a release schedule for numpy 2.0 so we have something to aim at. I'm thinking sometime in the period April-June might be appropriate. There is a lot coming with the next release: the Enthought's numpy refactoring, Mark's float16 and iterator work, and support for IronPython. How do things look to the folks involved in those projects?
My suggestion is to do a 1.6 relatively soon, as the current trunk feels pretty stable to me, and it would be nice to release the features without having to go through the whole merging process.
I would target June / July at this point ;-) I know I deserve a "I told you so" from Chuck --- I will take it.
How much remains to get done?
My changes probably make merging the refactor more challenging too.
There is a bit of work that Mark is doing that would be good to include, also some modifications to the re-factoring that will support better small array performance.
Not everything needs to go into the first release as long as the following releases are backward compatible. So the ABI needs its final form as soon as possible. Is it still in flux?
I would suggest it is - there are a number of things I think could be improved in it, and it would be nice to bake in the underlying support features to make lazy/deferred evaluation of array expressions possible.
It may make sense for a NumPy 1.6 to come out in March / April in the interim.
Pulling out the changes to attain backward compatibility isn't getting any easier. I'd rather shoot for 2.0 in June. What can the rest of us do to help move things along?
I took a shot at fixing the ABI compatibility, and if PyArray_ArrFunc was the main issue, then that might be done. An ABI compatible 1.6 with the datetime and half types should be doable, just some extensions might get confused if they encounter arrays made with the new data types. -Mark
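(For readers coming to this archive later: a minimal sketch, added here for illustration and not part of the original exchange, of what the two dtypes under discussion look like from Python. It assumes a NumPy new enough to ship them; float16 landed in 1.6, while the datetime64 API was still experimental at the time and its exact spelling changed before it stabilized.)

import numpy as np

# Half-precision floats: 16 bits per element, roughly 3 decimal digits.
x = np.arange(5, dtype=np.float16)
print(x.dtype)        # float16
print(x.itemsize)     # 2

# Date values with an explicit unit (days); differences are timedelta64.
d = np.array(['2011-01-25', '2011-06-01'], dtype='datetime64[D]')
print(d[1] - d[0])    # 127 days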
On Wed, Jan 26, 2011 at 12:28 PM, Mark Wiebe <mwwiebe@gmail.com> wrote:
On Tue, Jan 25, 2011 at 5:18 PM, Charles R Harris < charlesr.harris@gmail.com> wrote:
On Tue, Jan 25, 2011 at 1:13 PM, Travis Oliphant <oliphant@enthought.com>wrote:
It may make sense for a NumPy 1.6 to come out in March / April in the interim.
Pulling out the changes to attain backward compatibility isn't getting any easier. I'd rather shoot for 2.0 in June. What can the rest of us do to help move things along?
Focusing on 2.0 makes sense to me too. Besides that, March/April is bad timing for me so someone else should volunteer to be the release manager if we go for a 1.6.
I took a shot at fixing the ABI compatibility, and if PyArray_ArrFunc was the main issue, then that might be done. An ABI compatible 1.6 with the datetime and half types should be doable, just some extensions might get confused if they encounter arrays made with the new data types.
Even if you fixed the ABI incompatibility (I don't know enough about the issue to confirm that), I'm not sure how much value there is in a release with as main new feature two dtypes that are not going to work well with scipy/other binaries compiled against 1.5.
Cheers, Ralf
On Wed, Jan 26, 2011 at 2:23 AM, Ralf Gommers <ralf.gommers@googlemail.com>wrote:
On Wed, Jan 26, 2011 at 12:28 PM, Mark Wiebe <mwwiebe@gmail.com> wrote:
On Tue, Jan 25, 2011 at 5:18 PM, Charles R Harris < charlesr.harris@gmail.com> wrote:
On Tue, Jan 25, 2011 at 1:13 PM, Travis Oliphant <oliphant@enthought.com> wrote:
It may make sense for a NumPy 1.6 to come out in March / April in the interim.
Pulling out the changes to attain backward compatibility isn't getting any easier. I'd rather shoot for 2.0 in June. What can the rest of us do to help move things along?
Focusing on 2.0 makes sense to me too. Besides that, March/April is bad timing for me so someone else should volunteer to be the release manager if we go for a 1.6.
I think sooner than March/April might be a possibility. I've gotten the ABI working so this succeeds on my machine:
* Build SciPy against NumPy 1.5.1
* Build NumPy trunk
* Run NumPy trunk with the 1.5.1-built SciPy - all tests pass except for one (PIL image resize, which tests all float types and half lacks the precisions necessary)
I took a shot at fixing the ABI compatibility, and if PyArray_ArrFunc was the main issue, then that might be done. An ABI compatible 1.6 with the datetime and half types should be doable, just some extensions might get confused if they encounter arrays made with the new data types.
Even if you fixed the ABI incompatibility (I don't know enough about the issue to confirm that), I'm not sure how much value there is in a release with as main new feature two dtypes that are not going to work well with scipy/other binaries compiled against 1.5.
I've recently gotten the faster ufunc NEP implementation finished except for generalized ufuncs, and most things work the same or faster with it. Below are some timings of 1.5.1 vs the new_iterator branch. In particular, the overhead on small arrays hasn't gotten worse, but the output memory layout speeds up some operations by a lot.

To exercise the iterator a bit, and try to come up with a better approach than the generalized ufuncs, I came up with a new function, 'einsum', for the Einstein summation convention. I'll send another email about it, but it for instance solves the problem discussed here: http://mail.scipy.org/pipermail/numpy-discussion/2006-May/020506.html as "c = np.einsum('rij,rjk->rik', a, b)".

-Mark

The timings:

In [1]: import numpy as np
In [2]: np.version.version
Out[2]: '1.5.1'
In [3]: a = np.arange(9.).reshape(3,3); b = a.copy()
In [4]: timeit a + b
100000 loops, best of 3: 3.48 us per loop
In [5]: timeit 2 * a
100000 loops, best of 3: 6.07 us per loop
In [6]: timeit np.sum(a)
100000 loops, best of 3: 7.19 us per loop
In [7]: a = np.arange(1000000).reshape(100,100,100); b = a.copy()
In [8]: timeit a + b
100 loops, best of 3: 17.1 ms per loop
In [9]: a = np.arange(1920*1080*3).reshape(1080,1920,3).swapaxes(0,1)
In [10]: timeit a * a
1 loops, best of 3: 794 ms per loop

In [1]: import numpy as np
In [2]: np.version.version
Out[2]: '2.0.0.dev-c97e9d5'
In [3]: a = np.arange(9.).reshape(3,3); b = a.copy()
In [4]: timeit a + b
100000 loops, best of 3: 3.24 us per loop
In [5]: timeit 2 * a
100000 loops, best of 3: 6.12 us per loop
In [6]: timeit np.sum(a)
100000 loops, best of 3: 6.6 us per loop
In [7]: a = np.arange(1000000).reshape(100,100,100); b = a.copy()
In [8]: timeit a + b
100 loops, best of 3: 17 ms per loop
In [9]: a = np.arange(1920*1080*3).reshape(1080,1920,3).swapaxes(0,1)
In [10]: timeit a * a
10 loops, best of 3: 116 ms per loop
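(An illustrative sketch, not part of the original message: how the quoted einsum call expresses the stacked matrix product from that 2006 thread. It assumes a NumPy that ships np.einsum, i.e. 1.6 or later.)

import numpy as np

# A stack of 10 pairs of 3x3 matrices.
a = np.random.rand(10, 3, 3)
b = np.random.rand(10, 3, 3)

# For each r: c[r, i, k] = sum over j of a[r, i, j] * b[r, j, k].
c = np.einsum('rij,rjk->rik', a, b)

# The same thing spelled as an explicit loop over the leading axis.
c_loop = np.array([np.dot(a[r], b[r]) for r in range(a.shape[0])])
assert np.allclose(c, c_loop)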
On Wed, Jan 26, 2011 at 1:10 PM, Mark Wiebe <mwwiebe@gmail.com> wrote:
On Wed, Jan 26, 2011 at 2:23 AM, Ralf Gommers <ralf.gommers@googlemail.com> wrote:
On Wed, Jan 26, 2011 at 12:28 PM, Mark Wiebe <mwwiebe@gmail.com> wrote:
On Tue, Jan 25, 2011 at 5:18 PM, Charles R Harris < charlesr.harris@gmail.com> wrote:
On Tue, Jan 25, 2011 at 1:13 PM, Travis Oliphant < oliphant@enthought.com> wrote:
It may make sense for a NumPy 1.6 to come out in March / April in the interim.
Pulling out the changes to attain backward compatibility isn't getting any easier. I'd rather shoot for 2.0 in June. What can the rest of us do to help move things along?
Focusing on 2.0 makes sense to me too. Besides that, March/April is bad timing for me so someone else should volunteer to be the release manager if we go for a 1.6.
I think sooner than March/April might be a possibility. I've gotten the ABI working so this succeeds on my machine:
If we go with a 1.6 I have some polynomial stuff I want to put in, probably a weekend or two of work, and there are tickets and pull requests to look through, so to me March-April looks like a good time. It sounds like Ralf has stuff scheduled for the rest of the spring after the scipy release. IIRC, there was at least one other person interested in managing a release when David left for Silveregg, do we have any volunteers for a 1.6?

If we do go for 1.6 I would like to keep 2.0 in sight. If datetime, the new iterator, einsum, and float16 are in 1.6, then 2.0 looks more like a "clean up the library/interface and support IronPython" release, and there isn't as much pressure to get it out soon. Also it is important to get the ABI right so we don't need to change it again soon, and doing that might take a bit of trial and error. Does September seem reasonable?

* Build SciPy against NumPy 1.5.1
* Build NumPy trunk
* Run NumPy trunk with the 1.5.1-built SciPy - all tests pass except for one (PIL image resize, which tests all float types and half lacks the precisions necessary)
I took a shot at fixing the ABI compatibility, and if PyArray_ArrFunc was the main issue, then that might be done. An ABI compatible 1.6 with the datetime and half types should be doable, just some extensions might get confused if they encounter arrays made with the new data types.
Even if you fixed the ABI incompatibility (I don't know enough about the issue to confirm that), I'm not sure how much value there is in a release with as main new feature two dtypes that are not going to work well with scipy/other binaries compiled against 1.5.
I've recently gotten the faster ufunc NEP implementation finished except for generalized ufuncs, and most things work the same or faster with it. Below are some timings of 1.5.1 vs the new_iterator branch. In particular, the overhead on small arrays hasn't gotten worse, but the output memory layout speeds up some operations by a lot.
<snip> Chuck
On Thu, Jan 27, 2011 at 11:09 AM, Charles R Harris < charlesr.harris@gmail.com> wrote:
On Wed, Jan 26, 2011 at 1:10 PM, Mark Wiebe <mwwiebe@gmail.com> wrote:
On Wed, Jan 26, 2011 at 2:23 AM, Ralf Gommers < ralf.gommers@googlemail.com> wrote:
On Wed, Jan 26, 2011 at 12:28 PM, Mark Wiebe <mwwiebe@gmail.com> wrote:
On Tue, Jan 25, 2011 at 5:18 PM, Charles R Harris < charlesr.harris@gmail.com> wrote:
On Tue, Jan 25, 2011 at 1:13 PM, Travis Oliphant < oliphant@enthought.com> wrote:
It may make sense for a NumPy 1.6 to come out in March / April in the interim.
Pulling out the changes to attain backward compatibility isn't getting any easier. I'd rather shoot for 2.0 in June. What can the rest of us do to help move things along?
Focusing on 2.0 makes sense to me too. Besides that, March/April is bad timing for me so someone else should volunteer to be the release manager if we go for a 1.6.
I think sooner than March/April might be a possibility. I've gotten the ABI working so this succeeds on my machine:
If we go with a 1.6 I have some polynomial stuff I want to put in, probably a weekend or two of work, and there are tickets and pull requests to look through, so to me March-April looks like a good time. It sounds like Ralf has stuff scheduled for the rest of the spring after the scipy release. IIRC, there was at least one other person interested in managing a release when David left for Silveregg, do we have any volunteers for a 1.6?
If we do go for 1.6 I would like to keep 2.0 in sight. If datetime, the new iterator, einsum, and float16 are in 1.6, then 2.0 looks more like a "clean up the library/interface and support IronPython" release, and there isn't as much pressure to get it out soon. Also it is important to get the ABI right so we don't need to change it again soon, and doing that might take a bit of trial and error. Does September seem reasonable?
* Build SciPy against NumPy 1.5.1
* Build NumPy trunk
* Run NumPy trunk with the 1.5.1-built SciPy - all tests pass except for one (PIL image resize, which tests all float types and half lacks the precisions necessary)
The PIL test can still be fixed before the final 0.9.0 release, it looks like we will need another RC anyway. Does anyone have time for this in the next few days?
I took a shot at fixing the ABI compatibility, and if PyArray_ArrFunc was the main issue, then that might be done. An ABI compatible 1.6 with the datetime and half types should be doable, just some extensions might get confused if they encounter arrays made with the new data types.
Even if you fixed the ABI incompatibility (I don't know enough about the issue to confirm that), I'm not sure how much value there is in a release with as main new feature two dtypes that are not going to work well with scipy/other binaries compiled against 1.5.
I've recently gotten the faster ufunc NEP implementation finished except for generalized ufuncs, and most things work the same or faster with it. Below are some timings of 1.5.1 vs the new_iterator branch. In particular, the overhead on small arrays hasn't gotten worse, but the output memory layout speeds up some operations by a lot.
Your new additions indeed look quite promising. I tried your new_iterator branch but ran into a segfault immediately on running the tests on OS X. I opened a ticket for it, to not mix it into this discussion about releases too much: http://projects.scipy.org/numpy/ticket/1724.
Before we decide on a 1.6 release I would suggest to do at least the following:
- review of ABI fixes by someone very familiar with the problem that occurred in 1.4.0 (David, Pauli, Charles?)
- test on Linux, OS X and Windows 32-bit and 64-bit. Also with an MSVC build on Windows, since that exposes more issues each release.
Cheers, Ralf
On Thu, Jan 27, 2011 at 7:09 AM, Ralf Gommers <ralf.gommers@googlemail.com>wrote:
<snip> The PIL test can still be fixed before the final 0.9.0 release, it looks like we will need another RC anyway. Does anyone have time for this in the next few days?
I've attached a patch which fixes it for me.
I took a shot at fixing the ABI compatibility, and if PyArray_ArrFunc was the main issue, then that might be done. An ABI compatible 1.6 with the datetime and half types should be doable, just some extensions might get confused if they encounter arrays made with the new data types.
Even if you fixed the ABI incompatibility (I don't know enough about the issue to confirm that), I'm not sure how much value there is in a release with as main new feature two dtypes that are not going to work well with scipy/other binaries compiled against 1.5.
I've recently gotten the faster ufunc NEP implementation finished except for generalized ufuncs, and most things work the same or faster with it. Below are some timings of 1.5.1 vs the new_iterator branch. In particular, the overhead on small arrays hasn't gotten worse, but the output memory layout speeds up some operations by a lot.
Your new additions indeed look quite promising. I tried your new_iterator branch but ran into a segfault immediately on running the tests on OS X. I opened a ticket for it, to not mix it into this discussion about releases too much: http://projects.scipy.org/numpy/ticket/1724.
Is that a non-Intel platform? While I tried to get aligned access right, it's likely there's a bug in it somewhere.
Before we decide on a 1.6 release I would suggest to do at least the following:
- review of ABI fixes by someone very familiar with the problem that occurred in 1.4.0 (David, Pauli, Charles?)
- test on Linux, OS X and Windows 32-bit and 64-bit. Also with an MSVC build on Windows, since that exposes more issues each release.
All tests pass for me now, maybe it's a good time to merge the branch into the trunk so we can run it on the buildbot? -Mark
On Thu, Jan 27, 2011 at 9:17 AM, Mark Wiebe <mwwiebe@gmail.com> wrote:
On Thu, Jan 27, 2011 at 7:09 AM, Ralf Gommers <ralf.gommers@googlemail.com> wrote:
<snip> The PIL test can still be fixed before the final 0.9.0 release, it looks like we will need another RC anyway. Does anyone have time for this in the next few days?
I've attached a patch which fixes it for me.
I took a shot at fixing the ABI compatibility, and if PyArray_ArrFunc was the main issue, then that might be done. An ABI compatible 1.6 with the datetime and half types should be doable, just some extensions might get confused if they encounter arrays made with the new data types.
Even if you fixed the ABI incompatibility (I don't know enough about the issue to confirm that), I'm not sure how much value there is in a release with as main new feature two dtypes that are not going to work well with scipy/other binaries compiled against 1.5.
I've recently gotten the faster ufunc NEP implementation finished except for generalized ufuncs, and most things work the same or faster with it. Below are some timings of 1.5.1 vs the new_iterator branch. In particular, the overhead on small arrays hasn't gotten worse, but the output memory layout speeds up some operations by a lot.
Your new additions indeed look quite promising. I tried your new_iterator branch but ran into a segfault immediately on running the tests on OS X. I opened a ticket for it, to not mix it into this discussion about releases too much: http://projects.scipy.org/numpy/ticket/1724.
Is that a non-Intel platform? While I tried to get aligned access right, it's likely there's a bug in it somewhere.
Before we decide on a 1.6 release I would suggest to do at least the following:
- review of ABI fixes by someone very familiar with the problem that occurred in 1.4.0 (David, Pauli, Charles?)
- test on Linux, OS X and Windows 32-bit and 64-bit. Also with an MSVC build on Windows, since that exposes more issues each release.
All tests pass for me now, maybe it's a good time to merge the branch into the trunk so we can run it on the buildbot?
Might be better to merge your unadulterated stuff into master, make a 1.6 branch, and add the compatibility fixes in the branch. You can test branches on the buildbot I think, at least that worked for svn, I haven't tried it with github. Chuck
On Thu, Jan 27, 2011 at 9:36 AM, Charles R Harris <charlesr.harris@gmail.com> wrote:
<snip>
All tests pass for me now, maybe it's a good time to merge the branch into the trunk so we can run it on the buildbot?
Might be better to merge your unadulterated stuff into master, make a 1.6 branch, and add the compatibility fixes in the branch. You can test branches on the buildbot I think, at least that worked for svn, I haven't tried it with github.
I'm inclined to put the ABI fixes in trunk as well for the time being. The two changes of note, moving the 'cast' array to the end of PyArray_ArrFuncs and making 'flags' in PyArray_Descr bigger, can be reapplied if the 2.0 refactor ends up needing them. I think for 2.0, more extensive future-proofing will be desirable anyway, so trunk may as well be ABI compatible until it's clear what changes are necessary. -Mark
My $0.02 on the NumPy 2.0 schedule: NumPy 2.0 is for ABI-incompatible changes like datetime support and .NET support. It would be ideal if, at the same time, we could future-proof the ABI somewhat so that future changes can be made in an ABI-compatible way. I also think it would be a good idea to incorporate Mark's small-array improvements into the C-structure of NumPy arrays. If Mark has time to work on this, I have some hope we can get there.

I have been wanting to propose a "generator array" for some time now, but have not had time to write it up. I have the outline of a design that overlaps with, but I think generalizes, Mark's deferred arrays. Mark's deferred arrays would be a particular realization of the generator array, but other realizations are possible as well. There is much that has to be fleshed out for it to really work, and I think it will have to be in NumPy 2.0 because it will create ABI changes. I don't have the time to personally implement the design. If there are others out there that have the time, I would love to talk with them about it. However, I don't want to distract from this scheduling thread to discuss the ideas (I will post something else for that).

The reason for a NumPy 1.6 suggestion is that Mark (and others it would seem) have additional work and features that do not need to wait for the NumPy 2.0 ABI design to finalize in order to get out there. If someone is willing to manage the release of NumPy 1.6, then it sounds like a great idea to me.

-Travis

On Jan 27, 2011, at 11:56 AM, Mark Wiebe wrote:
On Thu, Jan 27, 2011 at 9:36 AM, Charles R Harris <charlesr.harris@gmail.com> wrote: <snip> All tests pass for me now, maybe it's a good time to merge the branch into the trunk so we can run it on the buildbot?
Might be better to merge your unadulterated stuff into master, make a 1.6 branch, and add the compatibility fixes in the branch. You can test branches on the buildbot I think, at least that worked for svn, I haven't tried it with github.
I'm inclined to put the ABI fixes in trunk as well for the time being. The two changes of note, moving the 'cast' array to the end of PyArray_ArrFuncs and making 'flags' in PyArray_Descr bigger, can be reapplied if the 2.0 refactor ends up needing them. I think for 2.0, more extensive future-proofing will be desirable anyway, so trunk may as well be ABI compatible until it's clear what changes are necessary.
-Mark
--- Travis Oliphant Enthought, Inc. oliphant@enthought.com 1-512-536-1057 http://www.enthought.com
Hi,
On Fri, Jan 28, 2011 at 7:15 AM, Travis Oliphant <oliphant@enthought.com> wrote:
The reason for a NumPy 1.6 suggestion, is that Mark (and others it would seem) have additional work and features that do not need to wait for the NumPy 2.0 ABI design to finalize in order to get out there. If someone is willing to manage the release of NumPy 1.6, then it sounds like a great idea to me.
This thread ended without a conclusion a month ago. Now I think master is in a better state than a month ago for a release (py 2.4/2.5/3.x issues and segfault on OS X fixed, more testing of changes), and I have a better idea of my free time for March/April. Basically, I have a good amount of time for the next couple of weeks, and not so much at the end of March / first half of April due to an inter-continental move. But I think we can get out a beta by mid-March, and I can manage the release.

I've had a look at the bug tracker, here's a list of tickets for 1.6:
#1748 (blocker: regression for astype('str'))
#1619 (issue with dtypes, with patch)
#1749 (distutils, py 3.2)
#1601 (distutils, py 3.2)
#1622 (Solaris segfault, with patch)
#1713 (Solaris segfault)
#1631 (Solaris segfault)
I can look at the distutils tickets.

The other thing that needs to be done is some (more) documentation of new features. Einsum and the new iterator seem to be well documented, but not described in the release notes. Datetime has no docs as far as I can see except for two similar NEPs.

Proposed schedule:
March 15: beta 1
March 28: rc 1
April 17: rc 2 (if needed)
April 24: final release

Let me know what you think. Bonus points for volunteering to fix some of those tickets :)

Cheers, Ralf
On Mon, Feb 28, 2011 at 4:00 PM, Ralf Gommers <ralf.gommers@googlemail.com> wrote:
The other thing that needs to be done is some (more) documentation of new features. Einsum and the new iterator seem to be well documented, but not described in the release notes.
Hmm, I take that back. Just saw that np.newiter and its methods don't have any docstrings at all.
Mon, 28 Feb 2011 16:50:59 +0800, Ralf Gommers wrote:
On Mon, Feb 28, 2011 at 4:00 PM, Ralf Gommers <ralf.gommers@googlemail.com> wrote:
The other thing that needs to be done is some (more) documentation of new features. Einsum and the new iterator seem to be well documented, but not described in the release notes.
Hmm, I take that back. Just saw that np.newiter and its methods don't have any docstrings at all.
The new iterator C-API should also be documented (if it is supposed to be public), which is not AFAIK done yet.
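(Added for context, not part of the thread: the Python-level iterator discussed here as np.newiter shipped in 1.6 under the name np.nditer. A minimal sketch of the eventual API, written against a modern NumPy:)

import numpy as np

a = np.arange(6).reshape(2, 3)
b = np.arange(3)

# Two read-only inputs (broadcast together) and one allocated output.
it = np.nditer([a, b, None],
               op_flags=[['readonly'], ['readonly'], ['writeonly', 'allocate']])
with it:
    for x, y, out in it:
        out[...] = x + y      # x, y, out are 0-d views of each element
    result = it.operands[2]

print(result)                  # same values as a + b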
I just want to say that I am looking forward to np.newiter, and I am impressed at how quickly it's being released.
On Mon, Feb 28, 2011 at 2:49 AM, Pauli Virtanen <pav@iki.fi> wrote:
Mon, 28 Feb 2011 16:50:59 +0800, Ralf Gommers wrote:
On Mon, Feb 28, 2011 at 4:00 PM, Ralf Gommers <ralf.gommers@googlemail.com> wrote:
The other thing that needs to be done is some (more) documentation of new features. Einsum and the new iterator seem to be well documented, but not described in the release notes.
Hmm, I take that back. Just saw that np.newiter and its methods don't have any docstrings at all.
The new iterator C-API should also be documented (if it is supposed to be public), which is not AFAIK done yet.
On 02/28/2011 02:00 AM, Ralf Gommers wrote:
<snip>
Is this 1.6 or 2.0? The title is 2.0 but you talk about 1.6, so some tickets listed as 2.0 may apply to 1.6.
It would be great to do some 'housekeeping' and try to address some of the old tickets dealt with before numpy 2.0. For example, I think ticket 225 (bincount does not accept input of type > N.uint16) has been addressed but it needs to be checked from windows and 32-bit systems.

Bruce

Created 2006:
#38 strides accepted as an argument to records.array
#57 ufunc methods need improved BUFFER loop
#213 SharedLibrary builder for numpy.distutils
#225 bincount does not accept input of type > N.uint16
#236 reduceat cornercase
#237 reduceat should handle outlier indices gracefully
#244 Build fails with Intel Visual Fortran compiler
#260 Add mechanism for registering objects to be deallocated and memory-to-be freed at Python exit
#274 Speed up N-D Boolean indexing
#301 power with negative argument returns 0
#333 Creating an array from a n-dim dtype type fails
#338 Valgrind warning when calling scipy.interpolate.interp1d
#349 Improve unit tests in linalg
#354 Possible inconsistency in 0-dim and scalar empty array types
#398 Compatibility loader for old Numeric pickles
#400 C API access to fft for C scipy extension ?
#402 newaxis incompatible with array indexing

Numpy 1.0
#450 Make a.min() not copy data
#417 Numpy 1.0.1 compilation fails on IRIX 6.5
#527 fortran linking flag option...
#1176 deepcopy turns ndarry into string_
#1143 Improve performance of PyUFunc_Reduce
#931 Records containing zero-length items pickle just fine, but cannot be unpickled
#803 Assignment problem on matrix advanced selection

Numpy 1.1
#1266 Extremely long runtimes in numpy.fft.fft
#963 Object array comparisons eat exceptions
#929 empty_like and zeros_like behave differently from ones_like
#934 Documentation error in site.cfg.example

Numpy 1.2
#1374 Ticket 628 not fixed for Solaris (polyfit uses 100% CPU and does not stop)
#1209 Docstring for numpy.numarray.random_array.multinomial is out of date.
#1192 integer dot product
#1172 abs does not work with -maxint
#1163 Incorrect conversion to Int64 by loadtxt (traced to _getconv in numpy.lib.io)
#1161 Errors and/or wrong result with reverse slicing in numpy.delete
#1094 masked array autotest fails with bus error
#1085 Surprising results from in-place operations involving views
#1071 loadtxt fails if the last column contains empty value
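(A quick check sketch added here, not from the thread, for the ticket-225 item Bruce mentions: exercising bincount with integer types wider than uint16, which the ticket says used to be rejected. On a fixed 64-bit NumPy all three calls should succeed.)

import numpy as np

for dt in (np.uint16, np.uint32, np.int64):
    x = np.array([0, 1, 1, 3], dtype=dt)
    print(dt.__name__, np.bincount(x))   # expect [1 2 0 1] for each dtype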
On Mon, Feb 28, 2011 at 8:36 AM, Bruce Southey <bsouthey@gmail.com> wrote:
<snip>
So, is there still no hope in addressing this old bug report of mine? http://projects.scipy.org/numpy/ticket/1562 Ben Root
On 02/28/2011 09:02 AM, Benjamin Root wrote: [snip]
So, is there still no hope in addressing this old bug report of mine?
http://projects.scipy.org/numpy/ticket/1562
Ben Root
I think you need to add more details to this. So do you have an example of the problem that includes code and expected output?

Perhaps genfromtxt is more appropriate than loadtxt for what you want:

from StringIO import StringIO
import numpy as np
t = StringIO("1,1.3,abcde\n2,2.3,wxyz\n1\n3,3.3,mnop")
data = np.genfromtxt(t, [('myint','i8'),('myfloat','f8'),('mystring','S5')],
                     names = ['myint','myfloat','mystring'], delimiter=",",
                     invalid_raise=False)
print 'Bad data raise\n', data

This gives the output that skips the incomplete 3rd line:

/usr/lib64/python2.7/site-packages/numpy/lib/npyio.py:1507: ConversionWarning: Some errors were detected !
    Line #3 (got 1 columns instead of 3)
  warnings.warn(errmsg, ConversionWarning)
Bad data raise
[(1, 1.3, 'abcde') (2, 2.3, 'wxyz') (3, 3.3, 'mnop')]

Bruce
On Mon, Feb 28, 2011 at 9:25 AM, Bruce Southey <bsouthey@gmail.com> wrote:
<snip>
Bruce,

I think you misunderstood the problem I was reporting. You can find the discussion thread here:
http://www.mail-archive.com/numpy-discussion@scipy.org/msg26235.html

I have proposed that, at the very least, an example of this problem is added to the documentation of loadtxt so that users know to be aware of this possibility. In addition, loadtxt fails on empty files even when provided with a dtype. I believe genfromtxt fails as well in this case.

Ben Root
On 02/28/2011 09:47 AM, Benjamin Root wrote:
<snip>
Bruce,
I think you mis-understood the problem I was reporting.
Probably - which is why I asked for more details.
You can find the discussion thread here:
http://www.mail-archive.com/numpy-discussion@scipy.org/msg26235.html
I have proposed that at the very least, an example of this problem is added to the documentation of loadtxt so that users know to be aware of this possibility.
I did not connect the ticket to that email thread. Removing the structured array part of your email, I think essentially the argument is which should be the output of:
np.loadtxt(StringIO("89.23"))
np.arange(5)[1]
These return a 0-d array, and there is a rather old argument about that (which may address the other part of the ticket). Really, I see this behavior as standard, so you could add an example to the documentation to reflect that.
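(A small sketch added for illustration, not part of the thread, making the squeezing behavior concrete; this is still loadtxt's default on a modern NumPy unless the later-added ndmin keyword is used, and np.atleast_2d is the workaround discussed further below.)

from io import StringIO   # the thread's Python 2 examples use StringIO.StringIO
import numpy as np

one_value = np.loadtxt(StringIO("89.23"))
one_row = np.loadtxt(StringIO("1.0 2.0 3.0"))
two_rows = np.loadtxt(StringIO("1.0 2.0 3.0\n4.0 5.0 6.0"))

print(one_value.shape)   # () - a 0-d array
print(one_row.shape)     # (3,) - squeezed to 1-d
print(two_rows.shape)    # (2, 3)

# Code that expects a 2-d table no matter how many rows the file has can
# force the shape explicitly:
print(np.atleast_2d(one_row).shape)   # (1, 3)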
In addition, loadtxt fails on empty files even when provided with a dtype. I believe genfromtxt also fails as well in this case.
Ben Root
Errors on empty files probably should be a new bug report as that was not in the ticket. Bruce
On Mon, Feb 28, 2011 at 11:45 AM, Bruce Southey <bsouthey@gmail.com> wrote:
<snip>
I did not connect the ticket to that email thread. Removing the structured array part of your email, I think essentially the argument is which should be the output of: np.loadtxt(StringIO("89.23")) np.arange(5)[1]
These return an 0-d array and an rather old argument about that (which may address the other part of the ticket). Really I see this behavior as standard so you add an example to the documentation to reflect that.
I agree that this behavior has become standard, and, by-and-large, desirable. It just comes with this sneaky pitfall when encountering single-line files. Therefore, I have a couple of suggestions that I would find suitable for resolution of this report. I will leave it up to the developers to decide which course to pursue.

1. Add a "mindims" parameter that would default to None (for current behavior). The caller can specify the minimum number of dimensions the resulting array should have and then call some sort of function like np.atleast_nd() (I know it doesn't exist, but such a function might be useful). The documentation for this keyword param would allude to the rationale for its use.

2. Keep the current behavior, but possibly not for when a dtype is specified. Given that the squeeze() was meant for addressing the situation where the data structure is not known a priori, squeezing a known dtype seems to go against this rationale.

3. Keep the current behavior, but add some documentation for loadtxt() that illustrates the problem and shows the usage of a function like np.atleast_2d(). I would be willing to write up such an example.
In addition, loadtxt fails on empty files even when provided with a dtype. I believe genfromtxt also fails as well in this case.
Ben Root
Errors on empty files probably should be a new bug report as that was not in the ticket.
Done: http://projects.scipy.org/numpy/ticket/1752 Thanks, Ben Root
On Fri, Jan 28, 2011 at 1:36 AM, Charles R Harris <charlesr.harris@gmail.com> wrote:
On Thu, Jan 27, 2011 at 9:17 AM, Mark Wiebe <mwwiebe@gmail.com> wrote:
On Thu, Jan 27, 2011 at 7:09 AM, Ralf Gommers < ralf.gommers@googlemail.com> wrote:
<snip> The PIL test can still be fixed before the final 0.9.0 release, it looks like we will need another RC anyway. Does anyone have time for this in the next few days?
I've attached a patch which fixes it for me.
Thanks, I'll check and apply it.
I took a shot at fixing the ABI compatibility, and if PyArray_ArrFunc was the main issue, then that might be done. An ABI compatible 1.6 with the datetime and half types should be doable, just some extensions might get confused if they encounter arrays made with the new data types.
Even if you fixed the ABI incompatibility (I don't know enough about the issue to confirm that), I'm not sure how much value there is in a release with as main new feature two dtypes that are not going to work well with scipy/other binaries compiled against 1.5.
I've recently gotten the faster ufunc NEP implementation finished except for generalized ufuncs, and most things work the same or faster with it. Below are some timings of 1.5.1 vs the new_iterator branch. In particular, the overhead on small arrays hasn't gotten worse, but the output memory layout speeds up some operations by a lot.
Your new additions indeed look quite promising. I tried your new_iterator branch but ran into a segfault immediately on running the tests on OS X. I opened a ticket for it, to not mix it into this discussion about releases too much: http://projects.scipy.org/numpy/ticket/1724.
Is that a non-Intel platform? While I tried to get aligned access right, it's likely there's a bug in it somewhere.
No, standard Intel and i386 Python.
Before we decide on a 1.6 release I would suggest to do at least the following:
- review of ABI fixes by someone very familiar with the problem that occurred in 1.4.0 (David, Pauli, Charles?)
- test on Linux, OS X and Windows 32-bit and 64-bit. Also with an MSVC build on Windows, since that exposes more issues each release.
All tests pass for me now, maybe it's a good time to merge the branch into the trunk so we can run it on the buildbot?
Might be better to merge your unadulterated stuff into master, make a 1.6 branch, and add the compatibility fixes in the branch. You can test branches on the buildbot I think, at least that worked for svn, I haven't tried it with github.
The buildbot is not working with github yet.
Ralf
On 01/25/2011 10:28 PM, Mark Wiebe wrote:
<snip>
I do understand that it may take time for the 'dust to settle', but there is the opportunity to implement aspects that may require 'significant' notification, or at least start the process for any appropriate changes. So, would it be possible to start developing some strategic plan of the changes that will occur? The types of things I am thinking of are:

1) Notifying/warning users of the API changes that will occur. I agree with Chuck that other 'eyes' need to see it.

2) Add any desired deprecation warnings, but I do not know of any. Perhaps the files in numpy/oldnumeric and numpy/numarray - if these are still important then these should have a better home since both have not had a release since mid 2006.

3) Changes or reorganization of the namespace. My personal one is my ticket 1051 (Renaming and removing NaN and related IEEE754 special cases): http://projects.scipy.org/numpy/ticket/1051 Hopefully some of it will be applied.

4) Changes in functions. Examples:
Ticket 1262 (genfromtxt: dtype should be None by default) http://projects.scipy.org/numpy/ticket/1262
Tickets 465 and 518, related to the accumulator dtype argument issues, because this topic keeps appearing on the list.
http://projects.scipy.org/numpy/ticket/518
http://projects.scipy.org/numpy/ticket/465
For example, perhaps changing the default arguments of mean in numpy/core/fromnumeric.py, as that allows the old behavior to remain by changing the dtype argument:
Change: def mean(a, axis=None, dtype=None, out=None):
To: def mean(a, axis=None, dtype=float, out=None):

5) Adding any enhancement patches like median of Ticket 1213 http://projects.scipy.org/numpy/ticket/1213

Bruce
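(An illustration added here, not from the thread, of the accumulator-dtype issue behind tickets 465/518: reductions on a float32 array accumulate in float32 by default, and depending on the NumPy version and summation strategy the result can drift noticeably from the float64 answer. Passing an explicit dtype, as in the mean() signature change Bruce sketches above, forces a wider accumulator.)

import numpy as np

a = np.full(10**7, 0.1, dtype=np.float32)

# Default: the reduction is carried out with a float32 accumulator.
print(a.sum())
print(a.mean())

# Explicit dtype: accumulate in float64 instead (close to 1e6, up to
# float32's representation error for 0.1).
print(a.sum(dtype=np.float64))
print(a.mean(dtype=np.float64))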
On 01/26/2011 01:42 AM, Charles R Harris wrote:
Hi All,
Just thought it was time to start discussing a release schedule for numpy 2.0 so we have something to aim at. I'm thinking sometime in the period April-June might be appropriate. There is a lot coming with the next release: the Enthought's numpy refactoring, Mark's float16 and iterator work, and support for IronPython. How do things look to the folks involved in those projects?
One thing which I was wondering about numpy 2.0: what's the story for the C-API compared to 1.x for extensions? Is it fundamentally different, so that extensions will need to be rewritten? I especially wonder about scipy and cython's codegen backend.
cheers, David
On Tue, Jan 25, 2011 at 6:05 PM, David <david@silveregg.co.jp> wrote:
On 01/26/2011 01:42 AM, Charles R Harris wrote:
Hi All,
Just thought it was time to start discussing a release schedule for numpy 2.0 so we have something to aim at. I'm thinking sometime in the period April-June might be appropriate. There is a lot coming with the next release: the Enthought's numpy refactoring, Mark's float16 and iterator work, and support for IronPython. How do things look to the folks involved in those projects?
One thing which I was wondering about numpy 2.0: what's the story for the C-API compared to 1.x for extensions. Is it fundamentally different so that extensions will need to be rewritten ? I especially wonder about scipy and cython's codegen backend,
The C-API looks the same, but anything with hard-coded type numbers and such will have problems. I would like to see the initial parts of the merge go in as early as possible so we can start chasing down any problems that turn up. Chuck
On 01/26/2011 02:05 AM, David wrote:
On 01/26/2011 01:42 AM, Charles R Harris wrote:
Hi All,
Just thought it was time to start discussing a release schedule for numpy 2.0 so we have something to aim at. I'm thinking sometime in the period April-June might be appropriate. There is a lot coming with the next release: the Enthought's numpy refactoring, Mark's float16 and iterator work, and support for IronPython. How do things look to the folks involved in those projects?
One thing which I was wondering about numpy 2.0: what's the story for the C-API compared to 1.x for extensions. Is it fundamentally different so that extensions will need to be rewritten ? I especially wonder about scipy and cython's codegen backend,
For CPython, my understanding is that extensions that access struct fields directly without accessor macros need to be changed, but not much else. There's a "backwards-compatibility" PyArray_* API for CPython.

That doesn't work for .NET, but neither does anything else in C extensions. So in the SciPy port to .NET there are my efforts to replace f2py with fwrap/Cython, and many SciPy C extensions will be rewritten in Cython. These will use the Npy_* interface (or backwards-compatibility PyArray_* wrappers in numpy.pxd, but these only work in Cython under .NET, not in C, due to typing issues (what is "object" and so on)).

Dag Sverre
On Wed, Jan 26, 2011 at 6:47 PM, Dag Sverre Seljebotn <dagss@student.matnat.uio.no> wrote:
On 01/26/2011 02:05 AM, David wrote:
On 01/26/2011 01:42 AM, Charles R Harris wrote:
Hi All,
Just thought it was time to start discussing a release schedule for numpy 2.0 so we have something to aim at. I'm thinking sometime in the period April-June might be appropriate. There is a lot coming with the next release: the Enthought's numpy refactoring, Mark's float16 and iterator work, and support for IronPython. How do things look to the folks involved in those projects?
One thing which I was wondering about numpy 2.0: what's the story for the C-API compared to 1.x for extensions. Is it fundamentally different so that extensions will need to be rewritten ? I especially wonder about scipy and cython's codegen backend,
For CPython, my understanding is that extensions that access struct fields directly without accessor macros need to be changed, but not much else. There's a "backwards-compatability" PyArray_* API for CPython.
That doesn't work for .NET, but neither does anything else in C extensions. So in the SciPy port to .NET there's my efforts to replace f2py with fwrap/Cython, and many SciPy C extensions will be rewritten in Cython. These will use the Npy_* interface (or backwards-compatability PyArray_* wrappers in numpy.pxd, but these only work in Cython under .NET, not in C, due to typing issues (what is "object" and so on)).
Ok, good to know. A good test would be to continuously build numpy + scipy on top of it ASAP. Do you think cython (or is it sage) could donate some CPU resources on the cython CI server for numpy? I could spend some time to make that work.
cheers, David
participants (11)
- Benjamin Root
- Bruce Southey
- Charles R Harris
- Dag Sverre Seljebotn
- David
- David Cournapeau
- John Salvatier
- Mark Wiebe
- Pauli Virtanen
- Ralf Gommers
- Travis Oliphant