Could somebody give me a pointer to an easily installable Windows
binary kit for NumPy? Even a somewhat older version (such as 11) would do
fine. I couldn't find any reference to such a kit from sourceforge or
the LLNL site.
I am just trying to compile the latest CVS updates of NumPy, and am
getting very cranky about the dependence of the build process on
distutils. Let me say it politely: using distutils at this point is
premature. Maybe for distributing released versions to the general public,
where the distutils setup.py script has been tested on many platforms, it
is a long-term solution.
but for right now, i miss the Makefile stuff.
(gustav)~/system/numpy/Numerical/: python setup.py build
/usr/bin/egcc -c -I/usr/include/python1.5 -IInclude -O3 -mpentium -fpic Src/_numpymodule.c Src/arrayobject.c Src/ufuncobject.c
Src/ufuncobject.c:783: conflicting types for `PyUFunc_FromFuncAndData'
/usr/include/python1.5/ufuncobject.h:100: previous declaration of `PyUFunc_FromFuncAndData'
Src/ufuncobject.c: In function `PyUFunc_FromFuncAndData':
Src/ufuncobject.c:805: structure has no member named `doc'
Src/ufuncobject.c: In function `ufunc_getattr':
Src/ufuncobject.c:955: structure has no member named `doc'
[blah blah blah]
if this was from running make, i could be inside xemacs and click on
the damn error and go right to the line in question. now, i gotta
horse around; it's a waste of time.
another example, which i posted to c.l.p a week or so ago, where it turned
out that distutils doesn't know that, for example, the arrayobject.h
in the distribution is newer than the one in /usr/include/python1.5, so it
breaks the build when API changes have been made. distutils should at
least get the order of the -I's correct. but we're not a distutils
SIG, we're NumPy, right?
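A minimal illustration of the include-order point, using the paths from the build log above (the exact compiler and flags are only an assumption):

```shell
# Put the distribution's own Include/ directory before the installed
# Python headers, so the in-tree copies of arrayobject.h and
# ufuncobject.h shadow any stale ones in /usr/include/python1.5:
cc -c -IInclude -I/usr/include/python1.5 Src/ufuncobject.c
```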
can we please go back, at least for now, to using -- or at a minimum
distributing -- the Makefiles?
Pyclimate 0.0 - Climate variability analysis using Numeric Python (28-Mar-00)
http://lcdx00.wm.lc.ehu.es/~jsaenz/pyclimate
We are making the first announcement of a pre-alpha release (version 0.0) of
our package pyclimate, which presents some tools used for climate
variability analysis and which makes extensive use of Numerical Python.
It is released under the GNU General Public License.
We call it a pre-alpha release. Even though the routines are
fairly well debugged, they are still growing, and we are thinking of making
a stable release shortly after receiving some feedback from users.
The package contains:
- ASCII files (simple, but useful)
- ncstruct.py: a netCDF structure copier. From a COARDS compliant netCDF
  file, this module creates another COARDS compliant file, copying the
  needed attributes, dimensions, auxiliary variables, comments, and so on.
Time handling routines
* JDTime.py -> Some C/Python functions to convert from date to Scaliger's
Julian Day and from Julian Day to date. We are not trying to
replace mxDate, but addressing a different problem.
In particular, this module contains a routine
especially suited to handling monthly time steps
for climatological use.
* JDTimeHandler.py -> Python module which parses the units
                     attribute of the time variable in a COARDS
                     file and offsets and scales the time values
                     appropriately to read/save date fields.
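JDTime.py's actual API is not shown in this announcement; as an illustration of the kind of date-to-Julian-Day conversion it provides, here is the standard Fliegel-Van Flandern integer-arithmetic algorithm (the function name is ours):

```python
def julian_day_number(year, month, day):
    """Julian Day Number (valid at noon UT) for a Gregorian calendar date,
    via the Fliegel-Van Flandern integer-arithmetic algorithm."""
    a = (14 - month) // 12          # 1 for January/February, else 0
    y = year + 4800 - a             # years since -4800, March-based year
    m = month + 12 * a - 3          # 0 = March ... 11 = February
    return (day + (153 * m + 2) // 5 + 365 * y
            + y // 4 - y // 100 + y // 400 - 32045)
```

For example, julian_day_number(2000, 1, 1) gives 2451545.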
Interface to DCDFLIB.C
A C/Python interface to the free DCDFLIB.C library is provided. This library
allows direct and inverse computations of parameters for several
probability distribution functions such as Chi^2, normal, binomial, F,
noncentral F, and many more.
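DCDFLIB.C's interface is not reproduced here; as a sketch of what "direct and inverse computations" means for, say, the normal distribution (function names and the bisection approach are ours, not the library's):

```python
import math

def norm_cdf(x):
    """Direct computation: P(X <= x) for a standard normal variable."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p):
    """Inverse computation: find x with norm_cdf(x) = p, by bisection."""
    lo, hi = -10.0, 10.0            # CDF is 0/1 to machine precision outside
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

A library like DCDFLIB computes such inverses with purpose-built routines rather than generic bisection; this only illustrates the direct/inverse pairing.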
Empirical Orthogonal Function analysis based on the SVD decomposition of
the data matrix and related functions to test the reliability/degeneracy
of eigenvalues (truncation rules). Monte Carlo test of the stability
of eigenvectors to temporal subsampling.
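pyclimate's own EOF routines are not shown in the announcement; the core of SVD-based EOF analysis can be sketched with modern NumPy (array shapes and names are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((120, 50))   # time x space data matrix
X -= X.mean(axis=0)                  # remove the time mean (anomalies)

# SVD of the anomaly matrix: rows of Vt are the EOFs (spatial patterns),
# U*s are the principal components (expansion coefficients).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
pcs = U * s
explained = s**2 / np.sum(s**2)      # variance fraction carried by each EOF
```

The truncation rules mentioned above amount to deciding how many rows of Vt to keep based on the eigenvalue spectrum s**2.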
SVD decomposition of the correlation matrix of two datasets, functions
to compute the expansion coefficients, the squared cumulative covariance
fraction, and the homogeneous and heterogeneous correlation maps.
Monte Carlo test of the stability of singular vectors to temporal
subsampling.
Multivariate digital filter
Multivariate digital filter (high and low pass) based on the
Differential operators on the sphere
Some classes to compute differential operators (gradient and divergence)
on a regular latitude/longitude grid.
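The announcement does not show pyclimate's operator classes; a minimal finite-difference sketch of the gradient on a regular latitude/longitude grid (the radius value, grid, and array names are ours):

```python
import numpy as np

a = 6.371e6                                       # Earth radius in metres
lats = np.deg2rad(np.arange(-80.0, 81.0, 2.0))    # regular lat/lon grid
lons = np.deg2rad(np.arange(0.0, 360.0, 2.0))
f = np.sin(lats)[:, None] * np.ones(lons.size)    # test field f = sin(lat)

# Spherical gradient components:
#   meridional: (1/a) df/dlat,  zonal: (1/(a cos lat)) df/dlon
df_dlat = np.gradient(f, lats, axis=0) / a
df_dlon = np.gradient(f, lons, axis=1) / (a * np.cos(lats))[:, None]
```

For f = sin(lat) the meridional component should approximate cos(lat)/a and the zonal component should vanish.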
To be able to use it, you will need:
1. Python ;-)
2. netCDF library 3.4 or later
3. Scientific Python, by Konrad Hinsen
4. DCDFLIB.C version 1.1
IF AND ONLY IF you really want to change the C code (JDTime.[hc] and
pycdf.[hc]), then, you will also need SWIG.
There is no automatic compilation/installation procedure, but the
Makefile is quite straightforward.
After manually editing the Makefile for your platform, the command
make test -> Runs a (not infallible) regression test
will do it.
SORRY, we don't use it under Windows, only UNIX. Volunteers
to generate a Windows installation file would be appreciated, but we
will not do it ourselves.
LaTeX, Postscript and PDF versions of the manual are included in the
distribution. However, we are preparing a new set of documentation
according to PSA rules.
Any feedback from the users of the package will be really appreciated
by the authors. We will try to incorporate new developments as far as
we are able to; our time availability is scarce.
Jon Saenz, jsaenz(a)wm.lc.ehu.es
Juan Zubillaga, wmpzuesj(a)lg.ehu.es
I tried to install a NumPy version downloaded this morning, using
distutils also downloaded then. This failed on my Linux box. First, the
setup.py script is not compatible with the newer distutils API. Second,
the glibc math.h includes a mathtypes.h which defines a function named
gamma, which leads to a name clash in ranlibmodule.c. I tried to solve
both problems; this is documented with the applied patch.
bhoel(a)starship.python.net / http://starship.python.net/crew/bhoel/
Hey Numpy people!
I have to integrate functions like:
Int_0^1 (t*(t-1))**-(H/2) dt
Int_-1^1 1/abs(t)**(1-H) dt, with H around .3
I just tried quadrature on the 1st one: it needs a very high order of
quadrature to be precise, and is in that case slow.
Would it work better with Multipack?
(I have to upgrade to python 1.5.2 to try Multipack!)
Thank you for your help.
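Multipack's QUADPACK-based routines would handle such integrands adaptively. As a self-contained sketch: the second integrand's algebraic singularity at t = 0 can also be removed analytically by the substitution t = u**(1/H), after which plain midpoint quadrature suffices (the function name and rule are ours):

```python
def integrate_power_singularity(f, H, n=2000):
    """Integrate f(t) ~ t**(H-1) over (0, 1] by substituting t = u**(1/H),
    which turns the integrand into f(u**(1/H)) * u**((1-H)/H) / H, smooth
    on [0, 1], then applying the midpoint rule (never evaluates at u = 0)."""
    total = 0.0
    for k in range(n):
        u = (k + 0.5) / n                   # midpoint of the k-th subinterval
        total += f(u ** (1.0 / H)) * u ** ((1.0 - H) / H) / H
    return total / n

H = 0.3
# Int_-1^1 1/abs(t)**(1-H) dt = 2 * Int_0^1 t**(H-1) dt = 2/H, by symmetry
val = 2.0 * integrate_power_singularity(lambda t: t ** (H - 1.0), H)
```

The first integral has singularities at both endpoints, so it would need the corresponding substitution at each end (or an adaptive routine that accepts endpoint singularities).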
Hey Numpy people!
I need to compute the Gamma function (the one related to the factorial) with
Numpy. It seems not to be included in the module; am I right? Is any code
for computing it available? Where could I find an algorithm to adapt?
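One widely used algorithm to adapt is the Lanczos approximation, sketched here in pure Python with the standard g = 7, n = 9 coefficient set (this is a generic textbook method, not code from Numeric):

```python
import math

# Standard Lanczos coefficients for g = 7, n = 9
_LANCZOS_G = 7
_LANCZOS_COEF = [
    0.99999999999980993, 676.5203681218851, -1259.1392167224028,
    771.32342877765313, -176.61502916214059, 12.507343278686905,
    -0.13857109526572012, 9.9843695780195716e-6, 1.5056327351493116e-7,
]

def gamma(z):
    """Gamma(z) for real z (not a non-positive integer), via Lanczos."""
    if z < 0.5:
        # Reflection formula: Gamma(z) * Gamma(1-z) = pi / sin(pi*z)
        return math.pi / (math.sin(math.pi * z) * gamma(1.0 - z))
    z -= 1.0
    x = _LANCZOS_COEF[0]
    for i in range(1, len(_LANCZOS_COEF)):
        x += _LANCZOS_COEF[i] / (z + i)
    t = z + _LANCZOS_G + 0.5
    return math.sqrt(2.0 * math.pi) * t ** (z + 0.5) * math.exp(-t) * x
```

It is accurate to roughly 13 significant digits over the real line, e.g. gamma(5.0) returns 24.0 (= 4!) and gamma(0.5) returns sqrt(pi).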
I have a couple of questions about the Numpy documentation:
1. Is there a recent version of the Numerical Python manual available
   anywhere? I can't find it at SourceForge and I've tried
   xfiles.llnl.org, but can't get through. (But from what Paul Dubois
   has said recently about the LLNL site, I shouldn't expect to.)
2. Have any changes been made to the documentation since about Q1
   1999? I think my current version dates from about that period.
As I've previously mentioned to Paul, I need a single precision version
of 'interp()', so I can use it on large single precision arrays, without
returning a double precision array.
In my own copy of NumPy, I've written such a routine and added it to
'arrayfns.c'. Naturally, I want to see this functionality built into
the official release, so I do not have to apply my own patches to new
releases.
Can we decide how such single precision needs are accommodated? Should
there be a keyword argument on the 'interp()' call that selects the
single precision version? Or should the caller invoke 'interpf()'
rather than 'interp()'?
I don't much care what the solution looks like, as long as people agree
that:
1) we need such functionality in NumPy, and
2) we can establish a precedent on how single precision vs. double
precision methods are invoked.
Please let me know your opinions on how this should be resolved.
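Both proposed calling conventions can be sketched against today's NumPy. Note that np.interp always computes in double, so the cast below only fixes the result type, not the working memory; a real single-precision interp would compute in float32 throughout (both functions here are hypothetical spellings of the proposal, not existing API):

```python
import numpy as np

def interp(x, xp, fp, single=False):
    """Linear interpolation; with single=True, return a float32 array.
    (Hypothetical keyword-argument spelling of the proposal.)"""
    y = np.interp(x, xp, fp)
    return y.astype(np.float32) if single else y

def interpf(x, xp, fp):
    """Hypothetical separate-function spelling of the same request."""
    return interp(x, xp, fp, single=True)
```

The keyword form keeps one entry point; the separate-function form matches the C tradition of 'f'-suffixed single-precision variants.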
National Center for Atmospheric Research