
Hi,

I am pleased to announce the release of the rc1 for numpy 1.3.0. You can find source tarballs and installers for both Mac OS X and Windows on the sourceforge page:

https://sourceforge.net/projects/numpy/

The release notes for the 1.3.0 release are below,

The Numpy developers

=========================
NumPy 1.3.0 Release Notes
=========================

This minor release includes numerous bug fixes, official python 2.6 support, and several new features such as generalized ufuncs.

Highlights
==========

Python 2.6 support
~~~~~~~~~~~~~~~~~~

Python 2.6 is now supported on all previously supported platforms, including windows.

http://www.python.org/dev/peps/pep-0361/

Generalized ufuncs
~~~~~~~~~~~~~~~~~~

There is a general need for looping over not only functions on scalars but also over functions on vectors (or arrays), as explained on http://scipy.org/scipy/numpy/wiki/GeneralLoopingFunctions. We propose to realize this concept by generalizing the universal functions (ufuncs), and provide a C implementation that adds ~500 lines to the numpy code base. In current (specialized) ufuncs, the elementary function is limited to element-by-element operations, whereas the generalized version supports "sub-array" by "sub-array" operations. The Perl vector library PDL provides a similar functionality and its terms are re-used in the following.

Each generalized ufunc has information associated with it that states what the "core" dimensionality of the inputs is, as well as the corresponding dimensionality of the outputs (the element-wise ufuncs have zero core dimensions). The list of the core dimensions for all arguments is called the "signature" of a ufunc. For example, the ufunc numpy.add has signature "(),()->()" defining two scalar inputs and one scalar output.

Another example is (see the GeneralLoopingFunctions page) the function inner1d(a,b) with a signature of "(i),(i)->()". This applies the inner product along the last axis of each input, but keeps the remaining indices intact. For example, where a is of shape (3,5,N) and b is of shape (5,N), this will return an output of shape (3,5). The underlying elementary function is called 3*5 times. In the signature, we specify one core dimension "(i)" for each input and zero core dimensions "()" for the output, since it takes two 1-d arrays and returns a scalar. By using the same name "i", we specify that the two corresponding dimensions should be of the same size (or one of them is of size 1 and will be broadcasted). A short illustrative example is given below, after the Highlights.

The dimensions beyond the core dimensions are called "loop" dimensions. In the above example, this corresponds to (3,5). The usual numpy "broadcasting" rules apply, where the signature determines how the dimensions of each input/output object are split into core and loop dimensions: While an input array has a smaller dimensionality than the corresponding number of core dimensions, 1's are pre-pended to its shape. The core dimensions are removed from all inputs and the remaining dimensions are broadcasted, defining the loop dimensions. The output is given by the loop dimensions plus the output core dimensions.

Experimental Windows 64 bits support
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Numpy can now be built on windows 64 bits (amd64 only, not IA64), with both MS compilers and mingw-w64 compilers:

This is *highly experimental*: DO NOT USE FOR PRODUCTION USE. See INSTALL.txt, Windows 64 bits section for more information on limitations and how to build it by yourself.
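As a quick illustration of the signature mechanics described above, a minimal sketch; it assumes the inner1d test ufunc is exposed as numpy.core.umath_tests.inner1d in this release:

  import numpy as np
  from numpy.core.umath_tests import inner1d   # assumed location of the test gufunc

  a = np.random.rand(3, 5, 10)   # loop dimensions (3, 5), core dimension i = 10
  b = np.random.rand(5, 10)      # broadcast against the loop dimensions of a

  c = inner1d(a, b)              # signature "(i),(i)->()": elementary function called 3*5 times
  assert c.shape == (3, 5)
  assert np.allclose(c, (a * b).sum(axis=-1))   # same result as an explicit reduction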
New features
============

Formatting issues
~~~~~~~~~~~~~~~~~

Float formatting is now handled by numpy instead of the C runtime: this enables locale independent formatting, and more robust fromstring and related methods. Special values (inf and nan) are also more consistent across platforms (nan vs IND/NaN, etc...), and more consistent with recent python formatting work (in 2.6 and later).

Nan handling in max/min
~~~~~~~~~~~~~~~~~~~~~~~

The maximum/minimum ufuncs now reliably propagate nans. If one of the arguments is a nan, then nan is returned. This affects np.min/np.max, amin/amax and the array methods max/min. New ufuncs fmax and fmin have been added to deal with non-propagating nans. (A short example follows these notes.)

Nan handling in sign
~~~~~~~~~~~~~~~~~~~~

The ufunc sign now returns nan for the sign of a nan.

New ufuncs
~~~~~~~~~~

#. fmax - same as maximum for integer types and non-nan floats. Returns the non-nan argument if one argument is nan and returns nan if both arguments are nan.
#. fmin - same as minimum for integer types and non-nan floats. Returns the non-nan argument if one argument is nan and returns nan if both arguments are nan.
#. deg2rad - converts degrees to radians, same as the radians ufunc.
#. rad2deg - converts radians to degrees, same as the degrees ufunc.
#. log2 - base 2 logarithm.
#. exp2 - base 2 exponential.
#. trunc - truncate floats to nearest integer towards zero.
#. logaddexp - add numbers stored as logarithms and return the logarithm of the result.
#. logaddexp2 - add numbers stored as base 2 logarithms and return the base 2 logarithm of the result.

Masked arrays
~~~~~~~~~~~~~

Several new features and bug fixes, including:

* structured arrays should now be fully supported by MaskedArray (r6463, r6324, r6305, r6300, r6294...)
* Minor bug fixes (r6356, r6352, r6335, r6299, r6298)
* Improved support for __iter__ (r6326)
* made baseclass, sharedmask and hardmask accessible to the user (but read-only)
* doc update

gfortran support on windows
~~~~~~~~~~~~~~~~~~~~~~~~~~~

Gfortran can now be used as a fortran compiler for numpy on windows, even when the C compiler is Visual Studio (VS 2005 and above; VS 2003 will NOT work). Gfortran + Visual studio does not work on windows 64 bits (but gcc + gfortran does). It is unclear whether it will be possible to use gfortran and visual studio at all on x64.

Arch option for windows binary
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Automatic arch detection can now be bypassed from the command line for the superpack installer:

  numpy-1.3.0-superpack-win32.exe /arch=nosse

will install a numpy which works on any x86, even if the running computer supports the SSE instruction set.

Deprecated features
===================

Histogram
~~~~~~~~~

The semantics of histogram has been modified to fix long-standing issues with outliers handling. The main changes concern

#. the definition of the bin edges, now including the rightmost edge, and
#. the handling of upper outliers, now ignored rather than tallied in the rightmost bin.

The previous behavior is still accessible using `new=False`, but this is deprecated, and will be removed entirely in 1.4.0.

Documentation changes
=====================

A lot of documentation has been added. Both the user guide and the reference can be built with sphinx.
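A small interactive illustration of the nan handling changes and two of the new ufuncs listed above; this is only a sketch, with the expected results in the comments following the semantics described in the notes:

  import numpy as np

  np.maximum(1.0, np.nan)   # nan: maximum/minimum now propagate nans
  np.fmax(1.0, np.nan)      # 1.0: fmax/fmin return the non-nan argument
  np.fmax(np.nan, np.nan)   # nan: nan only when both arguments are nan
  np.sign(np.nan)           # nan: sign of a nan is nan
  np.logaddexp(np.log(1e-50), np.log(2.5e-50))   # log(3.5e-50), computed without underflow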
New C API
=========

Multiarray API
~~~~~~~~~~~~~~

The following functions have been added to the multiarray C API:

* PyArray_GetEndianness: to get runtime endianness

Ufunc API
~~~~~~~~~

The following functions have been added to the ufunc API:

* PyUFunc_FromFuncAndDataAndSignature: to declare a more general ufunc (generalized ufunc).

New defines
~~~~~~~~~~~

New public C defines are available for ARCH specific code through numpy/npy_cpu.h:

* NPY_CPU_X86: x86 arch (32 bits)
* NPY_CPU_AMD64: amd64 arch (x86_64, NOT Itanium)
* NPY_CPU_PPC: 32 bits ppc
* NPY_CPU_PPC64: 64 bits ppc
* NPY_CPU_SPARC: 32 bits sparc
* NPY_CPU_SPARC64: 64 bits sparc
* NPY_CPU_S390: S390
* NPY_CPU_IA64: ia64
* NPY_CPU_PARISC: PARISC

New macros for CPU endianness have been added as well (see internal changes below for details):

* NPY_BYTE_ORDER: integer
* NPY_LITTLE_ENDIAN/NPY_BIG_ENDIAN defines

Those provide portable alternatives to the glibc endian.h macros for platforms without it.

Portable NAN, INFINITY, etc...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

npy_math.h now makes available several portable macros to get NAN, INFINITY:

* NPY_NAN: equivalent to NAN, which is a GNU extension
* NPY_INFINITY: equivalent to C99 INFINITY
* NPY_PZERO, NPY_NZERO: positive and negative zero respectively

Corresponding single and extended precision macros are available as well. All references to NAN, or home-grown computation of NAN on the fly, have been removed for consistency.

Internal changes
================

numpy.core math configuration revamp
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

This should make the porting to new platforms easier, and more robust. In particular, the configuration stage does not need to execute any code on the target platform, which is a first step toward cross-compilation.

http://projects.scipy.org/numpy/browser/trunk/doc/neps/math_config_clean.txt

umath refactor
~~~~~~~~~~~~~~

A lot of code cleanup for umath/ufunc code (charris).

Improvements to build warnings
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Numpy can now build with -W -Wall without warnings

http://projects.scipy.org/numpy/browser/trunk/doc/neps/warnfix.txt

Separate core math library
~~~~~~~~~~~~~~~~~~~~~~~~~~

The core math functions (sin, cos, etc... for basic C types) have been put into a separate library; it acts as a compatibility layer, to support most C99 maths functions (real only for now). The library includes platform-specific fixes for various maths functions, so using those versions should be more robust than using your platform functions directly. The API for existing functions is exactly the same as the C99 math functions API; the only difference is the npy prefix (npy_cos vs cos). The core library will be made available to any extension in 1.4.0.

CPU arch detection
~~~~~~~~~~~~~~~~~~

npy_cpu.h defines numpy specific CPU defines, such as NPY_CPU_X86, etc... Those are portable across OS and toolchains, and set up when the header is parsed, so that they can be safely used even in the case of cross-compilation (the values are not set when numpy is built), or for multi-arch binaries (e.g. fat binaries on Mac OS X).

npy_endian.h defines numpy specific endianness defines, modeled on the glibc endian.h. NPY_BYTE_ORDER is equivalent to BYTE_ORDER, and one of NPY_LITTLE_ENDIAN or NPY_BIG_ENDIAN is defined. As for CPU archs, those are set when the header is parsed by the compiler, and as such can be used for cross-compilation and multi-arch binaries.
Checksums
=========

5c6b2f02d0846317c6e7bffa39f6f828  release/installers/numpy-1.3.0rc1.zip
20cdddd69594420b0f8556bbc4a27a5a  release/installers/numpy-1.3.0rc1.tar.gz
f85231c4a27b39f7cb713ef22926931e  release/installers/numpy-1.3.0rc1-py2.5-macosx10.5.dmg
b24bb536492502611ea797d9410bb7c2  release/installers/numpy-1.3.0rc1-win32-superpack-python2.5.exe

Hi all, On Mar 28, 2009, at 9:26 AM, David Cournapeau wrote:
I am pleased to announce the release of the rc1 for numpy 1.3.0. You can find source tarballs and installers for both Mac OS X and Windows on the sourceforge page:
https://sourceforge.net/projects/numpy/
I have a PPC Mac, dual G5, running 10.5.6. The Mac OS X installer (numpy-1.3.0rc1-py2.5-macosx10.5.dmg) did not work for me. It said none of my disks were suitable for installation. The last time around, numpy-1.3.0b1-py2.5-macosx10.5.dmg persisted in installing itself into the system python rather than the Enthought distribution that I use, so I installed that version from the source tarball. This time, installing from the source tarball also went smoothly. Testing seems okay:
np.test()
Running unit tests for numpy
NumPy version 1.3.0rc1
NumPy is installed in /Library/Frameworks/Python.framework/Versions/4.1.30101/lib/python2.5/site-packages/numpy
Python version 2.5.2 |EPD Py25 4.1.30101| (r252:60911, Dec 19 2008, 15:28:32) [GCC 4.0.1 (Apple Computer, Inc. build 5370)]
nose version 0.10.3
........................................................................
................................................K...K...................
....................................S...................................
Ran 2030 tests in 13.930s

OK (KNOWNFAIL=2, SKIP=1)
<nose.result.TextTestResult run=2030 errors=0 failures=0>

Bob

Hi Robert, Thanks for the report. On Sun, Mar 29, 2009 at 12:10 AM, Robert Pyle <rpyle@post.harvard.edu> wrote:
Hi all,
On Mar 28, 2009, at 9:26 AM, David Cournapeau wrote:
I am pleased to announce the release of the rc1 for numpy 1.3.0. You can find source tarballs and installers for both Mac OS X and Windows on the sourceforge page:
https://sourceforge.net/projects/numpy/
I have a PPC Mac, dual G5, running 10.5.6.
The Mac OS X installer (numpy-1.3.0rc1-py2.5-macosx10.5.dmg) did not work for me. It said none of my disks were suitable for installation.
Hm, strange, I have never encountered this problem. To be sure I understand, you could open/mount the .dmg, but the .pkg refuses to install ?
The last time around, numpy-1.3.0b1-py2.5-macosx10.5.dmg persisted in installing itself into the system python rather than the Enthought distribution that I use, so I installed that version from the source tarball.
I am afraid there is nothing I can do here - the installer can only work with the system python I believe (or more exactly the python version I built the package against). Maybe people more familiar with bdist_mpkg could prove me wrong ? cheers, David

Hi David, On Mar 28, 2009, at 12:04 PM, David Cournapeau wrote:
Hi Robert,
Thanks for the report.
On Sun, Mar 29, 2009 at 12:10 AM, Robert Pyle <rpyle@post.harvard.edu> wrote:
The Mac OS X installer (numpy-1.3.0rc1-py2.5-macosx10.5.dmg) did not work for me. It said none of my disks were suitable for installation.
Hm, strange, I have never encountered this problem. To be sure I understand, you could open/mount the .dmg, but the .pkg refuses to install ?
Yes. When it gets to "Select a Destination", I would expect my boot disk to get the green arrow as the installation target, but it (and the other three disks) have the exclamation point in the red circle. Same thing happened on my MacBook Pro (Intel) with its one disk. As I noted before, however, installation from source went without problems on both machines. Bob

Robert Pyle wrote:
Yes. When it gets to "Select a Destination", I would expect my boot disk to get the green arrow as the installation target, but it (and the other three disks) have the exclamation point in the red circle. Same thing happened on my MacBook Pro (Intel) with its one disk.
Now that I think about it, maybe this is due to the lack of a python interpreter from python.org on your side. Did you install any other python besides the one included in EPD ? If that's the problem, we should at least mention somewhere that the python from python.org is needed. cheers, David

Hi David, On Mar 29, 2009, at 4:03 AM, David Cournapeau wrote:
Robert Pyle wrote:
Yes. When it gets to "Select a Destination", I would expect my boot disk to get the green arrow as the installation target, but it (and the other three disks) have the exclamation point in the red circle. Same thing happened on my MacBook Pro (Intel) with its one disk.
Now that I think about it, maybe this is due to the lack of a python interpreter from python.org on your side. Did you install any other python besides the one included in EPD ? If that's the problem, we should at least mention somewhere that the python from python.org is needed.
Okay, I just installed 2.6.1 from python.org, and it is now the version that starts when I type "python" to Terminal. I still cannot install numpy-1.3.0rc1 from the OS X installer, numpy-1.3.0rc1-py2.5-macosx10.5.dmg. Bob

On Sun, Mar 29, 2009 at 7:43 AM, Robert Pyle <rpyle@post.harvard.edu> wrote:
Hi David,
On Mar 29, 2009, at 4:03 AM, David Cournapeau wrote:
Robert Pyle wrote:
Yes. When it gets to "Select a Destination", I would expect my boot disk to get the green arrow as the installation target, but it (and the other three disks) have the exclamation point in the red circle. Same thing happened on my MacBook Pro (Intel) with its one disk.
Now that I think about it, maybe this is due to the lack of a python interpreter from python.org on your side. Did you install any other python besides the one included in EPD ? If that's the problem, we should at least mention somewhere that the python from python.org is needed.
Okay, I just installed 2.6.1 from python.org, and it is now the version that starts when I type "python" to Terminal. I still cannot install numpy-1.3.0rc1 from the OS X installer, numpy-1.3.0rc1-py2.5-macosx10.5.dmg
Yes, you can't install a python 2.5 package on python 2.6. It is almost like installing from sources is actually easier than from an installer on mac os x... I can relatively easily provide a 2.6 installer, though. cheers, David

On Mar 29, 2009, at 10:53 AM, David Cournapeau wrote:
On Sun, Mar 29, 2009 at 7:43 AM, Robert Pyle <rpyle@post.harvard.edu> wrote:
Hi David,
On Mar 29, 2009, at 4:03 AM, David Cournapeau wrote:
Robert Pyle wrote:
Yes. When it gets to "Select a Destination", I would expect my boot disk to get the green arrow as the installation target, but it (and the other three disks) have the exclamation point in the red circle. Same thing happened on my MacBook Pro (Intel) with its one disk.
Now that I think about it, maybe this is due to the lack of a python interpreter from python.org on your side. Did you install any other python besides the one included in EPD ? If that's the problem, we should at least mention somewhere that the python from python.org is needed.
Okay, I just installed 2.6.1 from python.org, and it is now the version that starts when I type "python" to Terminal. I still cannot install numpy-1.3.0rc1 from the OS X installer, numpy-1.3.0rc1-py2.5-macosx10.5.dmg
Yes, you can't install a python 2.5 package on python 2.6. It is almost like installing from sources is actually easier than from an installer on mac os x...
I just installed 2.5.4 from python.org, and the OS X installer still doesn't work. This is on a PPC G5; I haven't tried it on my Intel MacBook Pro. Bob

On Mon, Mar 30, 2009 at 3:36 AM, Robert Pyle <rpyle@post.harvard.edu> wrote:
I just installed 2.5.4 from python.org, and the OS X installer still doesn't work. This is on a PPC G5; I haven't tried it on my Intel MacBook Pro.
I think I got it. To build numpy, I use virtualenv to make a "bootstrap" environment, but then the corresponding python path gets embedded in the .mpkg - so unless you have your python interpreter in exactly the same path as my bootstrap (which is very unlikely), it won't run at all. This would also explain why I never saw the problem. I will prepare a new binary, cheers, David

David Cournapeau wrote:
On Mon, Mar 30, 2009 at 3:36 AM, Robert Pyle <rpyle@post.harvard.edu> wrote:
I just installed 2.5.4 from python.org, and the OS X installer still doesn't work. This is on a PPC G5; I haven't tried it on my Intel MacBook Pro.
Could you try this one ? http://www.ar.media.kyoto-u.ac.jp/members/david/archives/numpy/numpy-1.3.0rc... If it does not work, getting the /var/tmp/install.log would be helpful (the last few lines), cheers, David

Hi David, I decided to change the Subject line to be more apropos. On Mar 30, 2009, at 3:41 AM, David Cournapeau wrote:
David Cournapeau wrote:
On Mon, Mar 30, 2009 at 3:36 AM, Robert Pyle <rpyle@post.harvard.edu> wrote:
I just installed 2.5.4 from python.org, and the OS X installer still doesn't work. This is on a PPC G5; I haven't tried it on my Intel MacBook Pro.
Could you try this one ?
http://www.ar.media.kyoto-u.ac.jp/members/david/archives/numpy/numpy-1.3.0rc...
This one installs, but only in /Library/Python/2.5/site-packages/, that is, for Apple's system python. This happened when `which python` pointed to either EPD python or python.org's 2.5.4.
If it does not work, getting the /var/tmp/install.log would be helpful (the few last lines),
/var/tmp/ had a bunch of stuff in it, but no file named install.log. Perhaps that's because the installation succeeded?
Bob

On Mon, Mar 30, 2009 at 11:06 PM, Robert Pyle <rpyle@post.harvard.edu> wrote:
Hi David,
I decided to change the Subject line to be more apropos.
On Mar 30, 2009, at 3:41 AM, David Cournapeau wrote:
David Cournapeau wrote:
On Mon, Mar 30, 2009 at 3:36 AM, Robert Pyle <rpyle@post.harvard.edu> wrote:
I just installed 2.5.4 from python.org, and the OS X installer still doesn't work. This is on a PPC G5; I haven't tried it on my Intel MacBook Pro.
Could you try this one ?
http://www.ar.media.kyoto-u.ac.jp/members/david/archives/numpy/numpy-1.3.0rc...
This one installs, but only in /Library/Python/2.5/site-packages/, that is, for Apple's system python. This happened when `which python` pointed to either EPD python or python.org's 2.5.4.
Yes, what your default python is does not matter: I don't know the details, but it looks like the mac os x installer only checks whether a python binary exists in /System/Library/..., that is, the one I used to build the package. You can see this in the Info.plist inside the .mpkg.
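For instance, one rough way to see what the installer is checking - this is only a sketch, and both the path to the Info.plist and the exact key layout written by bdist_mpkg are assumptions here:

  import plistlib, pprint

  # Hypothetical location of the metadata inside the mounted .mpkg.
  info = 'numpy-1.3.0rc1-py2.5-macosx10.5.mpkg/Contents/Info.plist'
  plist = plistlib.readPlist(info)
  # bdist_mpkg records the interpreter it was built against as an installer
  # requirement; printing it shows which python the package expects to find.
  pprint.pprint(plist.get('IFRequirementDicts', []))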
If it does not work, getting the /var/tmp/install.log would be helpful (the few last lines),
/var/tmp/ had a bunch of stuff in it, but no file named install.log. Perhaps that's because the installation succeeded?
It is because I mistyped the path... the logs are in /var/log/install.log. I tried to find a way to at least print something about the missing requirement, but it does not look like there is a lot of documentation out there on the apple installer. cheers, David

David Cournapeau wrote:
On Mon, Mar 30, 2009 at 11:06 PM, Robert Pyle <rpyle@post.harvard.edu> wrote:
This one installs, but only in /Library/Python/2.5/site-packages/, that is, for Apple's system python. This happened when `which python` pointed to either EPD python or python.org's 2.5.4.
Yes, what your default python is does not matter: I don't know the details, but it looks like the mac os x installer only looks whether a python binary exists in /System/Library/..., that is the one I used to build the package. You can see this in the Info.plist inside the .mpkg.
Well, this is the big question: what python(s) should we provide binaries for -- I think if you're only going to do one, it should be the python.org build, so that you can support 10.4 and 10.5 and everyone can use it.

There are ways to build an installer that puts it in a place that both can find it -- wxPython does this -- but I'm not so sure that's a good idea.

One of the key questions is how one should think of Apple's Python. They are using it for some system tools, so we really shouldn't break it. If you upgrade the numpy it comes with, there is some chance that you could break something. Also, Apple has not (and likely won't) upgrade their Python. I know I happened to run into a bug and needed a newer 2.5, so I'd rather have the control.

A few years ago the MacPython community (as represented by the members of the pythonmac list) decided that the python.org build was the one that we should all target for binaries. That consensus has weakened with 10.5, as Apple did provide a Python that is fairly up to date and almost fully functional, but I think it's still a lot easier on everyone if we just stick with the python.org build as the one to target for binaries.

That being said, it shouldn't be hard to build separate binaries for each python -- they would be identical except for where they get installed, and if they are clearly marked for downloading, there shouldn't be too much confusion.

-Chris

On Tue, Mar 31, 2009 at 12:51 AM, Chris Barker <Chris.Barker@noaa.gov> wrote:
Well, this is the big question: what python(s) should we provide binaries for -- I think if you're only going to do one, it should be the python.org build, so that you can support 10.4 and 10.5 and everyone can use it.
I don't really care, as long as there is only one. Maintaining binaries for every python out there is too time consuming. Given that mac os x is the easiest platform to build numpy/scipy on, that's not something I am interested in.
There are ways to build an installer that puts it in a place that both can find it -- wxPython does this -- but I'm not so sure that's a good idea.
there is the problem of compatibility. I am not sure whether Apple's python and python.org's are ABI compatible - even if the version is the same, you can certainly build incompatible pythons (I don't know if that's the case on mac os).
Also, Apple has not (and likely won't) upgrade their Python. I know I happened to run into a bug and needed a newer 2.5, so I'd rather have the control.
That's a rather convincing argument. I will thus build binaries against python.org binaries (I still have to find a way to guarantee this in the build script, but that should not be too difficult).
That being said, it shouldn't be hard to build separate binaries for each python -- they would be identical except for where they get installed, and if they are clearly marked for downloading, there shouldn't be too much confusion.
My experience is that every choice presented to the user makes for more problems. And that just takes too much time. I prefer spending time making a few good installers rather than many half-baked ones. Ideally, we should have something which could install on every python version, but oh well, David

David Cournapeau wrote:
I don't really care, as long as there is only one. Maintaining binaries for every python out there is too time consuming. Given that mac os X is the easiest platform to build numpy/scipy on,
I assume you meant NOT the easiest? ;-)
that's not something i am interested in.
quite understandable.
There are ways to build an installer that puts it in a place that both can find it -- wxPython does this -- but I'm not so sure that's a good idea.
there is the problem of compatibility. I am not sure whether Apple python and python.org are ABI compatible
In theory, yes, and in practice, it seems to be working for wxPython. However, I agree that it's a bit risky. I'm at the PyCon MacPython sprint as we type -- and apparently Apple's is linked with the 10.5 sdk, whereas python.org's is linked against the 10.3 sdk -- so there could be issues.
I will thus build binaries against python.org binaries (I still have to find a way to guarantee this in the build script, but that should not be too difficult).
Hardcoding the path to python should work: PYTHON=/Library/Frameworks/Python.framework/Versions/2.5/bin/python
My experience is that every choice presented to the user makes for more problem. And that just takes too much time. I prefer spending time making a few good installers rather than many half baked.
I agree -- and most packages I use seem to supporting python.org exclusively for binaries.
Ideally, we should have something which could install on every python version, but oh well,
well, I guess that's the promise of easy_install -- but someone would have to build all the binary eggs... and there were weird issues with universal eggs on the mac that I understand have been fixed in 2.6, but not 2.5.

Thanks for all your work on this,

-Chris

On Tue, Mar 31, 2009 at 2:10 AM, Chris Barker <Chris.Barker@noaa.gov> wrote:
David Cournapeau wrote:
I don't really care, as long as there is only one. Maintaining binaries for every python out there is too time consuming. Given that mac os X is the easiest platform to build numpy/scipy on,
I assume you meant NOT the easiest? ;-)
Actually, no, I meant it :) It has gcc, which is the best supported compiler by numpy and scipy, there is almost no problem with g77, and the optimized blas/lapack is provided by the OS vendor, meaning no ABI issues, no weird atlas build errors, etc... It is almost impossible to get the build wrong on mac os x once you get the right fortran compiler.
In theory, yes, and in practice, it seems to be working for wxPython. However, I agree that it's a bit risky. I'm at the PyCon MacPython sprint as we type -- and apparently Apple's is linked with the 10.5 sdk, whereas python.org's is linked against the 10.3 sdk -- so there could be issues.
I am almost certain there are issues in some configurations, in particular x86_64. I don't know the details, but I have seen this kind of problem mentioned several times: http://osdir.com/ml/python-dev/2009-02/msg00339.html I can see how this could cause trouble.
I will thus build binaries against python.org binaries (I still have to find a way to guarantee this in the build script, but that should not be too difficult).
Hardcoding the path to python should work:
PYTHON=/Library/Frameworks/Python.framework/Versions/2.5/bin/python
Well, yes, but you can't really control this in the bdist_mpkg command. Also, my current paver file uses virtualenv to build an isolated numpy - that's what breaks the .mpkg, but I like this approach for building, so I would like to keep it as much as possible.
well, I guess that's the promise of easy_install -- but someone would have to build all the binary eggs... and there were weird issues with universal eggs on the mac that I understand have been fixed in 2.6, but not 2.5
There are numerous problems with eggs (or more precisely, with "easy" install), which I am just not interested in getting into. In particular, it often breaks the user system - fixing it is easy for developers/"power users", but is a PITA for normal users. As long as easy_install is broken, I don't want to use it. cheers, David

David Cournapeau wrote:
On Tue, Mar 31, 2009 at 2:10 AM, Chris Barker <Chris.Barker@noaa.gov> wrote:
I assume you meant NOT the easiest? ;-)
Actually, no, I meant it :) It has gcc, which is the best supported compiler by numpy and scipy, there is almost no problem with g77, and the optimized blas/lapack is provided by the OS vendor, meaning no ABI issues, no weird atlas build errors, etc... It is almost impossible to get the build wrong on mac os x once you get the right fortran compiler.
I see -- well that's good news. I've found the Universal library requirements to be a pain sometimes, and it probably would be here if Apple wasn't giving us lapack/blas.
I am almost certain there are issues in some configurations, in particular x86_64.
Well, neither Apple nor python.org's builds are 64 bit anyway at this point. There is talk of quad (i386, ppc, x86_64, and ppc64) builds in the future, though.
I will thus build binaries against python.org binaries (I still have to find a way to guarantee this in the build script, but that should not be too difficult). Hardcoding the path to python should work:
PYTHON=/Library/Frameworks/Python.framework/Versions/2.5/bin/python
Well, yes, but you can't really control this in the bdist_mpkg command.
bdist_mpkg should do "the right thing" if it's run with the right python. So you need to make sure you run:

/Library/Frameworks/Python.framework/Versions/2.5/bin/bdist_mpkg

Rather than whatever one happens to be found on your PATH.
Also, my current paver file uses virtualenv to build a isolated numpy - that's what breaks the .mpkg, but I like this approach for building, so I would like to keep it as much as possible.
Well, maybe we need to hack bdist_mpkg to support this, we're pretty sure that it is possible. I want to make sure I understand what you want: Do you want to be able to build numpy in a virtualenv, and then build an mpkg that will install into the user's regular Framework? Do you want to be able to build an mpkg that users can install into the virtualenv of their choice? Both? Of course, easy_install can do that, when it works!
There are numerous problems with eggs (or more precisely, with "easy" install), which I am just not interested in getting into.
me neither --
In particular, it often breaks the user system - fixing it is easy for developers/"power users", but is a PITA for normal users. As long as easy_install is broken, I don't want to use it.
We were just talking about some of that last night -- we really need an "easy_uninstall", for instance. I'm going to poke into bdist_mpkg a bit right now. By the way, for the libgfortran issue, while statically linking it may be the best option, it wouldn't be too hard to have the mpkg include and install /usr/local/lib/libgfortran.dylib (or whatever). -Chris

Chris Barker wrote:
I see -- well that's good news. I've found the Universal library requirements to be a pain sometimes, and it probably would be here if Apple wasn't giving us lapack/blas.
Yes, definitely. I could see a lot of trouble if people had to build a universal ATLAS :)
Well, neither Apple nor python.org's builds are 64 bit anyway at this point. There is talk of quad (i386, ppc, x86_64, and ppc64) builds in the future, though.
Yes, but that's something that has to be supported sooner rather than later.
bdist_mpkg should do "the right thing" if it's run with the right python. So you need to make sure you run:
/Library/Frameworks/Python.framework/Versions/2.5/bin/bdist_mpkg
Rather than whatever one happens to be found on your PATH.
Yes, that's the problem: this cannot work directly if I use virtualenv, since virtualenv works by recreating a 'fake' python somewhere else.
Well, maybe we need to hack bdist_mpkg to support this, we're pretty sure that it is possible.
I want to make sure I understand what you want:
Do you want to be able to build numpy in a virtualenv, and then build a mpkg that will install into the users regular Framework?
Yes - more exactly, there should be a way to guarantee that if I create a virtual env from a given python interpreter, I can target a .mpkg to this python interpreter.
Do you want to be able to build a mpkg that users can install into the virtualenv of their choice?
No - virtualenv is only an artefact of the build process - users should not care or even know I use virtualenv. I use virtualenv as a fast, poor-man's 'python chroot'. This way, I can build and install python in a directory with minimum interaction with the outside environment. Installing is necessary to build the doc correctly, and I don't want to mess my system with setuptools stuff.
Of course, easy_install can do that, when it works!
Except when it doesn't :)
We were just talking about some of that last night -- we really need a "easy_uninstall" for instance.
yes - but I think it is very difficult to do right with the current design of easy_install (I have thought a bit about those issues recently, and I have started writing something to organize my thoughts a bit better - I can keep you posted if you are interested).
By the way, for the libgfortran issue, while statically linking it may be the best option, it wouldn't be too hard to have the mpkg include and install /usr/local/lib/libgfortran.dylib (or whatever).
I don't think it is a good idea: it would overwrite any existing libgfortran.dylib, which would cause a lot of issues because libgfortran and gfortran have to be consistent. I know I would be very pissed if, after installing some software, some unrelated software were broken or, worse, overwritten. That's exactly what bothers me with easy_install. cheers, David

David Cournapeau wrote:
Chris Barker wrote:
Well, neither Apple nor python.org's builds are 64 bit anyway at this point. There is talk of quad (i386, ppc, x86_64, and ppc64) builds in the future, though.
Yes, but that's something that has to be supported sooner rather than later.
It does, but we don't need a binary installer for a python that doesn't have a binary installer.
Well, maybe we need to hack bdist_mpkg to support this, we're pretty sure that it is possible.
Yes - more exactly, there should be a way to guarantee that if I create a virtual env from a given python interpreter, I can target a .mpkg to this python interpreter.
Hmmm -- I don't know virtualenv enough to know what the virtualenv knows about how it was created...

However, I'm not sure you need to do what you're saying here. I imagine this workflow:

set up a virtualenv for, say numpy x.y.rc-z

play around with it, get everything to build, etc. with plain old setup.py build, setup.py install, etc.

Once you are happy, run:

/Library/Frameworks/Python.framework/Versions/2.5/bin/bdist_mpkg

(or the 2.6 equivalent, etc)

I THINK you'd get a .mpkg that was all set for the user to install in their Framework python. As long as you don't run the installer, you won't end up with it in your virtualenv. Or is this what you've tried and it has failed for you?

By the way, if you run bdist_mpkg from a version installed into your virtualenv, you will get an installer that will install into your virtualenv, with the path hard-coded, so really useless.
Installing is necessary to build the doc correctly, and I don't want to mess my system with setuptools stuff.
ah -- maybe that's the issue then -- darn. Are the docs included in the .mpkg? Do they need to be built for that?
I have started writing something to organize my thoughts a bit better - I can keep you posted if you are interested.
yes, I am.
By the way, for the libgfortran issue, while statically linking it may be the best option, it wouldn't be too hard to have the mpkg include and install /usr/local/lib/libgfortran.dylib (or whatever).
I don't think it is a good idea: it would overwrite any existing libgfortran.dylib, which would cause a lot of issues because libgfortran and gfortran have to be consistent. I know I would be very pissed if, after installing some software, some unrelated software were broken or, worse, overwritten.
True. In that case we could put the dylib somewhere obscure:

/usr/local/lib/scipy1.6/lib/

or even:

/Library/Frameworks/Python.framework/Versions/2.5/lib/

But using static linking is probably better. Actually, and I betray my ignorance here, but IIUC:

- There are a bunch of different scipy extensions that use libgfortran
- Many of them are built more-or-less separately
- So each of them would get their own copy of the static libgfortran
- Just how many separate copies of libgfortran is that?
- Enough to care?
- How big is libgfortran?

This is making me think solving the dynamic linking problem makes sense. Also, would it break anything if the libgfortran installed were properly versioned: libgfortran.a.b.c Isn't that the point of versioned libs?

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker@noaa.gov

Christopher Barker wrote:
It does, but we don't need a binary installer for a python that doesn't have a binary installer.
Yes, not now - but I would prefer to avoid having to change the process again when the time comes. It may not look like it, but getting a release process which works well on all platforms, including windows, took me several days. And I am in no hurry to go through this again :)
Hmmm -- I don't know virtualenv enough to know what the virtualenv knows about how it was created...
However, I'm not sure you need to do what you're saying here. I imagine this workflow:
set up a virtualenv for, say numpy x.y.rc-z
play around with it, get everything to build, etc. with plain old setup.py build, setup.py install, etc.
Once you are happy, run:
/Library/Frameworks/Python.framework/Versions/2.5/bin/bdist_mpkg
This means building the same thing twice. Now, for numpy, it is not that big a deal, but for scipy, not so much. If/when we have a good, reliable build farm for mac os x, this point becomes moot, so.
By the way, if you run bdist_mpkg from a version installed into your virtualenv, you will get an installer that will install into your virtualenv, with the path hard-coded, so really useless.
That's exactly the problem in the current binary :)
ah -- maybe that's the issue then -- darn. Are the docs included in the .mpkg? Do they need to be built for that?
The docs are included in the .dmg, and yes, the doc needs to be built from the same installation (or more exactly the same source).
yes, I am.
I have not tackled the uninstall part, but I already wrote this to "write in stone" my POV in the whole python packaging situation: http://cournape.wordpress.com/2009/04/01/python-packaging-a-few-observations...
True. In that case we could put the dylib somewhere obscure:
/usr/local/lib/scipy1.6/lib/
Hm, that's strange - why /usr/local/lib ? It is outside the scipy installation.
or even:
/Library/Frameworks/Python.framework/Versions/2.5/lib/
That's potentially dangerous: since this directory is likely to be in LIBDIR, it means libgfortran will be taken from there or from /usr/local/lib if the user builds numpy/scipy after installing numpy. If it is incompatible with the user's gfortran, it will lead to weird issues, hard to debug.

This problem bit me on windows 64 bits recently: we did something similar (creating a libpython*.a and putting it in C:\python*\libs), but the installed library was not 64 bits compatible - I assumed this library was built by python itself, and I wasted several hours looking elsewhere for a problem caused by numpy.distutils.

If we install something like libgfortran, it should be installed privately - but dynamically linking against private libraries is hard, because that's very platform dependent (in particular on windows, I have yet to see a sane solution - everyone just copies the private .dll alongside the binaries, AFAICS). Now, if you bring me a solution to this problem, I would be *really* glad.
Actually, and I betray my ignorance here, but IIUC:
- There are a bunch of different scipy extensions that use libgfortran - Many of them are built more-or-less separately - So each of them would get their own copy of the static libgfortran
AFAIK, statically linking a library does not mean the whole copy is put into the binary. I guess different binary formats do it differently, but for example, on linux:

gfortran hello.f -> a.out is 8 kb
gfortran hello.f -static-libgfortran -> a.out is 130 kb
libgfortran.a -> ~ 1.3Mb

Of course, this depends on the functions you need to pull out from libgfortran - but I guess we do not pull in so much, because we mainly use intrinsics (gfortran math functions, etc...) which should be very small. I don't think we use the fortran IO runtime much - actually, we should explicitly avoid it since it causes trouble: the C and fortran runtimes would 'fight' each other with unspeakable consequences. And thinking about it: mac os x rather encourages big binaries - "fat binaries" - so I am not sure it is a big concern.
This is making me think solving the dynamic linking problem makes sense.
It makes sense for a whole lot of reasons, but it is hard. The problem is much bigger on windows (where almost *everything* is statically linked), and I tried to tackle this to ship only one numpy installer with 3 atlas builds dynamically loaded at runtime - I did not find a workable solution.
Also, would it break anything if the libgfortran installed were properly versioned:
libgfortran.a.b.c
Isn't that the point of versioned libs?
versioned libraries only make sense for shared libraries, I think. On Linux, the static library is not even publicly available (it is in /usr/lib/gcc/4.3.3). I wonder whether the mac os x gfortran binary did not make a mistake there, actually. cheers, David

David Cournapeau wrote:
Christopher Barker wrote:
It does, but we don't need a binary installer for a python that doesn't have a binary installer.
Yes, not now - but I would prefer to avoid having to change the process again when the time comes. It may not look like it, but getting a release process which works well on all platforms, including windows, took me several days.
I'm not surprised it took that long -- that sounds short to me! Anyway, if there are new python builds that are 64-bit (quad?) you won't have to change much -- "only" make sure that the libs you are linking to are 64 bit. I suppose you could try to get a quad-universal gfortran.a now, but I'd wait till you need it.
However, I'm not sure you need to do what your saying here. I imagine this workflow:
set up a virtualenv for, say numpy x.y.rc-z
play around with it, get everything to build, etc. with plain old setup.py build, setup.py install, etc.
Once you are happy, run:
/Library/Frameworks/Python.framework/Versions/2.5/bin/bdist_mpkg
This means building the same thing twice.
does it? I just did a test: I set up a virtualenv with nothing in it. I used that environment to do:

setup.py build
setup.py install

and got an apparently working numpy in my virtualenv. Then I ran:

/Library/Frameworks/Python.framework/Versions/2.5/bin/bdist_mpkg

It did a lot, but I don't THINK it re-compiled everything. I think the trick is that everything it built was still in the build dir -- and it is the same python, even though it's not living in the same place. I got what seems to be a functioning Universal Installer for the python.org python.

Having said that, it looks like it may be very easy to hack a package built in the virtualenv to install in the right place. In:

dist/numpy-1.3.0rc1-py2.5-macosx10.4.mpkg/Contents/Packages/

there are two mpkgs. In each of those, there is:

Contents/Info.plist

which is an xml file. In there, there is:

<key>IFPkgFlagDefaultLocation</key>

which should be set to:

<string>/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages</string>

If it was built in a virtualenv, that would be the virtualenv path. It's a hack, but you could post-process the mpkg to change that value after building.
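Something along these lines could do that post-processing; this is only a sketch, the sub-package name inside Packages/ is a made-up example, and it assumes plistlib is available (it ships with python on the Mac):

  import plistlib

  # Hypothetical paths - adjust to the actual .mpkg produced by bdist_mpkg.
  info = ('dist/numpy-1.3.0rc1-py2.5-macosx10.4.mpkg/Contents/Packages/'
          'numpy-platlib-1.3.0rc1-py2.5-macosx10.4.mpkg/Contents/Info.plist')
  target = '/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages'

  plist = plistlib.readPlist(info)
  plist['IFPkgFlagDefaultLocation'] = target   # point away from the virtualenv path
  plistlib.writePlist(plist, info)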
The docs are included in the .dmg, and yes, the doc needs to be built from the same installation (or more exactly the same source).
In my tests, that seems to work fine.
True. In that case we could put the dylib somewhere obscure:
/usr/local/lib/scipy1.6/lib/
Hm, that's strange - why /usr/local/lib ? It is outside the scipy installation.
I'm not sure, really, except that that's where Robin Dunn puts stuff for wxPython -- I think one reason may be that he can then point to it from more than one Python installation -- for instance Apple's and python.org's. And it could help with building from a virtualenv.
or even:
/Library/Frameworks/Python.framework/Versions/2.5/lib/
That's potentially dangerous: since this directory is likely to be in LIBDIR, it means libgfortran will be taken there or from /usr/local/lib if the user builds numpy/scipy after installing numpy. If it is incompatible with the user gfortran, it will lead to weird issues, hard to debug.
showing what a pain all of this is! Of course, you could put it in: /Library/Frameworks/Python.framework/Versions/2.5/lib/site_packages/scipy/lib/ or something that only scipy would know about.
If we install something like libgfortran, it should be installed privately - but dynamically linking against private libraries is hard, because that's very platform dependent
yup -- on the Mac, it could work well, 'cause the paths to the libs are hard-coded when linked, so you WILL get the right one -- if it's there! Though this could lead to trouble when you want to build from a virtualenv, and install to the system location. macholib gives you tools to re-write the locations, but someone would have to write that code.
gfortran hello.f -> a.out is 8 kb
gfortran hello.f -static-libgfortran -> a.out is 130 kb
libgfortran.a -> ~ 1.3Mb
And thinking about it: mac os x rather encourage big binaries - "fat binary", so I am not sure it is a big concern.
so I guess we're back to static linking -- I guess that's why it's generally "recommended" for distributing binaries.
Also, would it break anything if the libgfortran installed were properly versioned: libgfortran.a.b.c
versioned libraries only make sense for shared libraries, I think.
right -- I meant for the dynamic lib option.
On Linux, the static library is not even publicly available (it is in /usr/lib/gcc/4.3.3). I wonder whether the mac os x gfortran binary did not make a mistake there, actually.
where are you getting that? I'm about to go on vacation, but maybe I could try a few things...

-Chris

Christopher Barker wrote:
Anyway, if there are new python builds that are 64-bit (quad?) you won't have to change much -- "only" make sure that the libs you are linking to are 64 bit. I suppose you could try to get a quad-universal gfortran.a now,
Actually, it looks like the binary at http://r.research.att.com/tools/ is already a quad-universal build.

-Chris

Christopher Barker wrote:
I'm not surprised it took that long -- that sounds short to me!
Anyway, if there are new python builds that are 64-bit (quad?) you won't have to change much -- "only" make sure that the libs you are linking to are 64 bit. I suppose you could try to get a quad-universal gfortran.a now, but I'd wait till you need it.
My main concern was about making sure I always built against the same python interpreter. For now, I have hardcoded the python executable to use for the mpkg, but that's not good enough.
It did a lot, but I don't THINK it re-compiled everything. I think the trick is that everything it built was still in the build dir -- and it is the same python, even though it's not living in the same place.
Well, yes, it could be a totally different, incompatible python and it would still do as you said - distutils cannot be trusted at all to do the right thing here, it has no real dependency system. The problem is to make sure that bdist_mpkg and the numpy installed in the virtualenv are built against the same python binary:

- it makes sure the build is actually compatible
- it also gives a sanity check about 'runnability' of the binary - at least, numpy can be imported (for the doc), and by building from scratch, I lose this.

Of course, the real solution would be to automatically test the built numpy - I unfortunately did not have the time to do this correctly,
<key>IFPkgFlagDefaultLocation</key>
which should be set to:
<string>/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages</string>
If it was built in a virtualenv, that would be the virtualenv path.
It's a hack, but you could post-process the mpkg to change that value after building.
This has the same problem - it works as long as the virtualenv python and the python targeted by the binary installer are the same.
showing what a pain all of this is! Of course, you could put it in:
/Library/Frameworks/Python.framework/Versions/2.5/lib/site_packages/scipy/lib/
or something that only scipy would know about.
That's the only correct solution I can see, yes.
On Linux, the static library is not even publicly available (it is in /usr/lib/gcc/4.3.3). I wonder whether the mac os x gfortran binary did not make a mistake there, actually.
where are you getting that? I'm about to go on vacation, but maybe I could try a few things...
On linux, libgfortran.a is /usr/lib/gcc/i486-linux-gnu/4.3/libgfortran.a -> this is private. On mac OS X, it is in /usr/local/lib -> this is public. Exactly the kind of 'small' thing which ends up causing quite some headache when you care about reliability and repeatability.

As a related problem, I am not a big fan of the Apple way of building fat binaries; I much prefer the approach of one build per arch, merging the binaries after the build with lipo. This is more compatible with all the autoconf projects out there - and it means using/updating the compilers is easy (building multi-arch binaries the 'apple' way is really a pain - the script used by R to build the universal gfortran is quite complicated). I think the above problem would not have happened with a pristine gfortran built from sources, cheers, David

David Cournapeau wrote:
Well, yes, it could be a totally different, incompatible python that it would still do as you said - distutils cannot be trusted at all to do the right thing here,
quite true -- though hopefully you know what is on your system, so it can only go so wrong. However I just noticed that in a brand new virtualenv, there is a ".Python" file, which is a symlink to:

/Library/Frameworks/Python.framework/Versions/2.5/Python

So you may be able to use that to ensure that the python you build the .mpkg with is the same as the one in the virtualenv.
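A minimal sketch of that check, assuming a hypothetical bootstrap virtualenv path; the idea is just to resolve the symlink and compare it to the python.org framework:

  import os

  VENV = '/Users/david/src/dsp/numpy/1.3.x/bootstrap'   # example virtualenv path
  expected = '/Library/Frameworks/Python.framework/Versions/2.5/Python'

  actual = os.path.realpath(os.path.join(VENV, '.Python'))
  if actual != os.path.realpath(expected):
      raise SystemExit("virtualenv was not created from the python.org 2.5: %s" % actual)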
- it also gives a sanity check about 'runnability' of the binary - at least, numpy can be imported (for the doc)
but it may not help with that. I think some of this could be improved by hacking bdist_mpkg, but someone would have to find the time to do that. It's still going to be a pain until core python supports some kind of versioning or virtualenv system, and all the tools then support that.
<key>IFPkgFlagDefaultLocation</key>
which should be set to:
<string>/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages</string>
If it was built in a virtualenv, that would be the virtualenv path.
This has the same problem - it works as long as the virtual python and the 'python targetted by the binary installer are the same'.
so maybe the above could help.
/Library/Frameworks/Python.framework/Versions/2.5/lib/site_packages/scipy/lib/ or something that only scipy would know about.
That's the only correct solution I can see, yes.
So it's that or static libs -- I don't understand what the static lib problem is, so I'm no help there.
where are you getting that? I'm about to go on vacation, but maybe I could try a few things...
On linux, libgfortran.a is /usr/lib/gcc/i486-linux-gnu/4.3/libgfortran.a -> this is private On mac OS X, it is /usr/local/lib -> this is public
I meant where did you get/how did you build gfortran in the first place. Are you using this one? http://r.research.att.com/tools/ They seem to be quad-architecture and have the static libs -- I haven't tried to build anything with it yet, though.
As a related problem, I am not a big fan about the Apple way of building fat binaries, I much prefer the approach one build/arch and merging the binaries after build with lipo.
I don't know -- the biggest issues I've seen are with build systems that assume you are building for the architecture you are on, which would break either way, unless you build each piece on a different machine. I just discovered that MacPorts now supports building Universal libs for at least some packages -- when it does, it seems to work great. NO gfortran, though.... I hope some of this helps. I really appreciate all you've done on this,

-Chris

On Mar 30, 2009, at 2:56 AM, David Cournapeau wrote:
On Mon, Mar 30, 2009 at 3:36 AM, Robert Pyle <rpyle@post.harvard.edu> wrote:
I just installed 2.5.4 from python.org, and the OS X installer still doesn't work. This is on a PPC G5; I haven't tried it on my Intel MacBook Pro.
I think I got it. To build numpy, I use virtualenv to make a "bootstrap" environment, but then the corresponding python path get embedded in the .mpkg - so unless you have your python interpreter in exactly the same path as my bootstrap (which is very unlikely), it won't run at all. This would also explain why I never saw the problem.
This is exactly the problem. This is the error message that you get when running the .dmg, with no hard drives available for selection: "You cannot install numpy 1.3.0rc1 on this volume. numpy requires /Users/david/src/dsp/numpy/1.3.x/bootstrap Python 2.5 to install."
I will prepare a new binary,
Any idea when a new binary will be available on sourceforge.net? Cheers Tommy

Hi all, On Mar 28, 2009, at 9:26 AM, David Cournapeau wrote:
I am pleased to announce the release of the rc1 for numpy 1.3.0. You can find source tarballs and installers for both Mac OS X and Windows on the sourceforge page:
https://sourceforge.net/projects/numpy/ <https://sourceforge.net/projects/numpy/>
On my Intel Mac (MacBook Pro), the OS X installer refused to recognize my disk as an installation target, just as it did on my dual G5 PPC. Installation from the tarball was successful, and numpy.test() was okay (KNOWNFAIL=1, SKIP=1). Bob

On 3/28/2009 9:26 AM David Cournapeau apparently wrote:
I am pleased to announce the release of the rc1 for numpy 1.3.0. You can find source tarballs and installers for both Mac OS X and Windows on the sourceforge page: https://sourceforge.net/projects/numpy/
Was the Python 2.6 Superpack intentionally omitted? Alan Isaac

On Sun, Mar 29, 2009 at 10:09 AM, Alan G Isaac <aisaac@american.edu> wrote:
On 3/28/2009 9:26 AM David Cournapeau apparently wrote:
I am pleased to announce the release of the rc1 for numpy 1.3.0. You can find source tarballs and installers for both Mac OS X and Windows on the sourceforge page: https://sourceforge.net/projects/numpy/
Was the Python 2.6 Superpack intentionally omitted?
No, I've added it. The 64-bit binary will come later. David

On Sat, Mar 28, 2009 at 2:26 PM, David Cournapeau <david@ar.media.kyoto-u.ac.jp> wrote:
Hi,
I am pleased to announce the release of the rc1 for numpy 1.3.0. You can find source tarballs and installers for both Mac OS X and Windows on the sourceforge page:
For the beta release, I can see both numpy-1.3.0b1-win32-superpack-python2.5.exe and numpy-1.3.0b1-win32-superpack-python2.6.exe However, for the first release candidate I can only see numpy-1.3.0rc1-win32-superpack-python2.5.exe - no Python 2.6 version. Is this an oversight, or maybe some caching issue with the sourceforge mirror system? In the meantime I'll give the beta a go on Python 2.6 on my Windows XP machine... Thanks, Peter

Peter wrote:
On Sat, Mar 28, 2009 at 2:26 PM, David Cournapeau <david@ar.media.kyoto-u.ac.jp> wrote:
Hi,
I am pleased to announce the release of the rc1 for numpy 1.3.0. You can find source tarballs and installers for both Mac OS X and Windows on the sourceforge page:
For the beta release, I can see both numpy-1.3.0b1-win32-superpack-python2.5.exe and numpy-1.3.0b1-win32-superpack-python2.6.exe
However, for the first release candidate I can only see numpy-1.3.0rc1-win32-superpack-python2.5.exe - no Python 2.6 version.
I uploaded it but forgot to update it on the sourceforge download page. This should be fixed, David

Hi, It might be too late (I was off-line last week), but anyway: I have set the milestone for ticket 1036 [1] to 1.4, but it does not change existing functionality, it brings some new functionality, and the tests pass, so I wonder if it could get into the 1.3 release? cheers, r. [1] http://projects.scipy.org/numpy/ticket/1036 David Cournapeau wrote:
Hi,
I am pleased to announce the release of the rc1 for numpy 1.3.0. You can find source tarballs and installers for both Mac OS X and Windows on the sourceforge page:
https://sourceforge.net/projects/numpy/ <https://sourceforge.net/projects/numpy/>
The release note for the 1.3.0 release are below,
The Numpy developers

Numpy 1.3.0 rc1 fails this self-test on Solaris.

======================================================================
FAIL: Test find_duplicates
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/ra/pyssg/2.5.1/numpy/lib/tests/test_recfunctions.py", line 163, in test_find_duplicates
    assert_equal(test[0], a[control])
  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 121, in assert_equal
    return assert_array_equal(actual, desired, err_msg)
  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 193, in assert_array_equal
    header='Arrays are not equal')
  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 186, in assert_array_compare
    verbose=verbose, header=header)
  File "/usr/stsci/pyssgdev/2.5.1/numpy/testing/utils.py", line 395, in assert_array_compare
    raise AssertionError(msg)
AssertionError: Arrays are not equal
(mismatch 50.0%)
 x: array([(1, (2.0, 'B')), (2, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 'B'))],
      dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
 y: array([(2, (2.0, 'B')), (1, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 'B'))],
      dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
----------------------------------------------------------------------

The software I am using:

NumPy version 1.3.0rc1
Python version 2.5.1 (r251:54863, Jun 4 2008, 15:48:19) [C]
nose version 0.10.4

I think this identifies the compilers it was built with:

customize SunFCompiler
Found executable /opt/SUNWspro-6u2/bin/f90
Could not locate executable echo ranlib
customize SunFCompiler
customize SunFCompiler using config
C compiler: cc -DNDEBUG -O -xcode=pic32

It passes in Python 2.5.1 on these machines:

x86 macintosh, 32 bit
Red Hat Enterprise 4 Linux, x86, 32 bit
RHE 3, x86, 32 bit
RHE 4, x86, 64 bit
PowerPC mac, 32 bit

(Yes, even the PPC mac.)

I see that this is the same problem as http://projects.scipy.org/numpy/ticket/1039 but the data used in the test is different.

Mark S.

On Mon, Mar 30, 2009 at 10:28 AM, Mark Sienkiewicz <sienkiew@stsci.edu> wrote:
Numpy 1.3.0 rc1 fails this self-test on Solaris.
======================================================================
FAIL: Test find_duplicates
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/ra/pyssg/2.5.1/numpy/lib/tests/test_recfunctions.py", line 163, in test_find_duplicates
    assert_equal(test[0], a[control])
  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 121, in assert_equal
    return assert_array_equal(actual, desired, err_msg)
  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 193, in assert_array_equal
    header='Arrays are not equal')
  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 186, in assert_array_compare
    verbose=verbose, header=header)
  File "/usr/stsci/pyssgdev/2.5.1/numpy/testing/utils.py", line 395, in assert_array_compare
    raise AssertionError(msg)
AssertionError: Arrays are not equal
(mismatch 50.0%)
 x: array([(1, (2.0, 'B')), (2, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 'B'))],
      dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
 y: array([(2, (2.0, 'B')), (1, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 'B'))],
      dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
----------------------------------------------------------------------
The software I am using:
NumPy version 1.3.0rc1 Python version 2.5.1 (r251:54863, Jun 4 2008, 15:48:19) [C] nose version 0.10.4
These are new (two months old) tests. Hmm, they are also marked as known failures on win32. I wonder why they fail there and not on linux? I think you should open a ticket for this. Chuck

====================================================================== FAIL: Test find_duplicates ---------------------------------------------------------------------- ... AssertionError: Arrays are not equal
(mismatch 50.0%) x: array([(1, (2.0, 'B')), (2, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 'B'))], dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])]) y: array([(2, (2.0, 'B')), (1, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 'B'))], dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
----------------------------------------------------------------------
These are new (two months old) tests. Hmm, they are also marked as known failures on win32. I wonder why they fail there and not on linux? I think you should open a ticket for this.
I'm not sure how old the test is, but I see that it has been failing since Feb 1. (That is the earliest report I have online at the moment.) The ticket is http://projects.scipy.org/numpy/ticket/1039 . I added this specific failure mode to the ticket today. It does not surprise me at all when the trunk is broken on solaris. I'm mentioning it on the list because I see it is still broken in the release candidate. I assume somebody would want to either fix the problem or remove the non-working feature from the release. Mark S.

On Mon, Mar 30, 2009 at 3:04 PM, Mark Sienkiewicz <sienkiew@stsci.edu> wrote:
====================================================================== FAIL: Test find_duplicates ---------------------------------------------------------------------- ... AssertionError: Arrays are not equal
(mismatch 50.0%) x: array([(1, (2.0, 'B')), (2, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 'B'))], dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])]) y: array([(2, (2.0, 'B')), (1, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 'B'))], dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
----------------------------------------------------------------------
These are new (two months old) tests. Hmm, they are also marked as known failures on win32. I wonder why they fail there and not on linux? I think you should open a ticket for this.
I'm not sure how old the test is, but I see that it has been failing since Feb 1. (That is the earliest report I have online at the moment.)
The ticket is http://projects.scipy.org/numpy/ticket/1039 . I added this specific failure mode to the ticket today.
It does not surprise me at all when the trunk is broken on solaris. I'm mentioning it on the list because I see it is still broken in the release candidate. I assume somebody would want to either fix the problem or remove the non-working feature from the release.
I'm guessing that it is the test that needs fixing. And maybe the windows problem is related. Chuck

Mon, 30 Mar 2009 15:15:06 -0600, Charles R Harris wrote:
I'm guessing that it is the test that needs fixing. And maybe the windows problem is related.
Probably they are both related to the unspecified sort order for the duplicates. Some sort-order-agnostic comparisons were missing in the test. I think the test is now fixed in trunk: http://projects.scipy.org/numpy/changeset/6827 -- Pauli Virtanen
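For illustration, a sort-order-agnostic comparison along the lines of that fix might look like the following sketch (the data here is made up and flatter than the nested dtype in the failing test):

    import numpy as np

    # Two structured arrays holding the same records in a different order.
    dtype = [('A', int), ('B', float)]
    expected = np.array([(1, 2.0), (2, 2.0), (2, 3.0), (1, 1.0)], dtype=dtype)
    result = np.array([(2, 3.0), (1, 2.0), (2, 2.0), (1, 1.0)], dtype=dtype)

    # Sort both sides on all fields before comparing, so the comparison
    # does not depend on the order in which duplicates are returned.
    se = np.sort(expected, order=['A', 'B'])
    sr = np.sort(result, order=['A', 'B'])
    assert all(np.array_equal(se[name], sr[name]) for name in se.dtype.names)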

Pauli Virtanen wrote:
Probably they are both related to the unspecified sort order for the duplicates. Some sort-order-agnostic comparisons were missing in the test.
I think the test is now fixed in trunk:
The test passes in 1.4.0.dev6827. Tested on Solaris 8, Mac OSX 10.4 (Tiger) on x86 and ppc, and both 32 and 64 bit Red Hat Enterprise, all with Python 2.5.1. Thanks for fixing this. Mark S.

Mon, 30 Mar 2009 14:03:17 -0600, Charles R Harris wrote:
On Mon, Mar 30, 2009 at 10:28 AM, Mark Sienkiewicz <sienkiew@stsci.edu> wrote:
Numpy 1.3.0 rc1 fails this self-test on Solaris. [clip] ====================================================================== FAIL: Test find_duplicates ---------------------------------------------------------------------- assert_equal(test[0], a[control])
x: array([(1, (2.0, 'B')), (2, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 'B'))], dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])]) y: array([(2, (2.0, 'B')), (1, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 'B'))], dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
These are new (two months old) tests. Hmm, they are also marked as known failures on win32. I wonder why they fail there and not on linux? I think you should open a ticket for this.
The data seems to be in a different order in the index array and in the data array returned by `find_duplicates`. Is it intended that find_duplicates guarantees that the returned indices correspond to the returned values? Another question: the 'recfunctions' module is not imported anywhere in numpy? (BTW, it might be good not to keep commented-out code such as those np.knownfail decorators in the repository, unless it's explained why it's commented out...) -- Pauli Virtanen
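As an illustration of the invariant being asked about, a quick check could look like the sketch below; it assumes find_duplicates keeps the (a, key=None, ignoremask=True, return_index=False) signature and is given a masked structured array, and the data is made up:

    import numpy as np
    from numpy.lib import recfunctions as rfn

    # Hypothetical masked structured array with duplicated 'A' values.
    data = [(1, 2.0), (2, 2.0), (2, 3.0), (1, 1.0), (3, 1.0)]
    mask = [(False, False)] * len(data)
    a = np.ma.array(data, mask=mask, dtype=[('A', int), ('B', float)])

    # If the returned indices correspond to the returned values, indexing
    # the original array with `index` should reproduce the duplicates.
    dups, index = rfn.find_duplicates(a, key='A', return_index=True)
    print(dups)
    print(a[index])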
participants (12)
- Alan G Isaac
- Charles R Harris
- Chris Barker
- Christopher Barker
- David Cournapeau
- David Cournapeau
- Mark Sienkiewicz
- Pauli Virtanen
- Peter
- Robert Cimrman
- Robert Pyle
- Tommy Grav