Is there a C-API function for NumPy which implements Python's
multidimensional indexing? Say I have a 2-d array
PyArrayObject * M;
and an index i, how do I extract the i-th row M[i,:] or the i-th column M[:,i]?
I am looking for a function which again returns a PyArrayObject * that
is a view of M (no copied data; the result should be another
PyArrayObject whose data pointer and strides point to the correct memory
portion of M).
I searched the API documentation, Google and mailing lists for quite a
long time but didn't find anything. Can you help me?
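As far as I can tell there is no single dedicated C-API call for this; one common route is to call PyObject_GetItem from C with an (i, slice) index tuple, which returns exactly the view that M[i,:] gives. A minimal sketch of the view semantics being asked for, at the Python level (array values are my own example data):

```python
import numpy as np

# Basic slicing returns a view: its data pointer and strides refer
# into M's buffer, and writes go through to M.
M = np.arange(12.0).reshape(3, 4).copy()   # .copy() so M owns its data

row = M[1, :]      # view of the second row
col = M[:, 2]      # view of the third column

assert row.base is M and col.base is M     # no data copied
assert np.shares_memory(row, M)

row[0] = 99.0
assert M[1, 0] == 99.0                     # write-through to M
```

The same base/strides relationship holds for the PyArrayObject you get back from PyObject_GetItem in C.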
Hello list. I've run into two SVD errors over the last few days. Both
errors are identical in numpy/scipy.
I've submitted a ticket for the 1st problem (numpy ticket #990). Summary
is: some builds of the lapack_lite module linking against system LAPACK
(not the bundled dlapack_lite.o, etc) give a "LinAlgError: SVD did not
converge" exception on my matrix. This error does occur using Mac's
Accelerate framework LAPACK, and a coworker's Ubuntu LAPACK version. It
does not seem to happen using ATLAS LAPACK (nor using Octave/Matlab on
Just today I've come across a negative singular value cropping up in an
SVD of a different matrix. This error does occur on my ATLAS LAPACK based
numpy, as well as on the Ubuntu setup. And once again, it does not happen
I'm using numpy 1.3.0.dev6336 -- don't know what the Ubuntu box is running.
Here are some npy files for the two different cases:
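For anyone wanting to reproduce the class of check involved, here is a small sanity test of the kind one could run against a given LAPACK build (random test matrix is my own; the failing matrices are in the npy files above):

```python
import numpy as np

# Singular values from a healthy LAPACK should be non-negative,
# sorted in descending order, and reconstruct the input matrix.
rng = np.random.RandomState(0)
m = rng.randn(50, 30)

u, s, vt = np.linalg.svd(m, full_matrices=False)

assert (s >= 0).all()                       # no negative singular values
assert (np.diff(s) <= 0).all()              # descending order
assert np.allclose(u @ np.diag(s) @ vt, m)  # round-trip reconstruction
```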
My question is about reading a Fortran binary file (oh no, this question
again). Until now, I was using struct.unpack like this:
from struct import unpack

def read_fortran_arrays(f, arrays, fourBeginning=True, fourEnd=True):
    """Reading a Fortran binary file in little-endian"""
    for array in arrays:
        if fourBeginning: f.seek(4, 1)   # skip the leading 4-byte record marker
        for elt in xrange(array.size):
            # '<f' assumes 4-byte little-endian reals
            array.flat[elt] = unpack('<f', f.read(4))[0]
        if fourEnd: f.seek(4, 1)         # skip the trailing record marker
After googling, I read that fopen and npfile were deprecated and that we should
use numpy.fromfile and ndarray.tofile, but despite the documentation, the
cookbook, the mailing list and Google, I haven't succeeded in making a simple
example. Considering the simple Fortran code below, what is the Python script
to read the four arrays? What if my PC is little-endian and the file
I think it would be a good idea to put the Fortran array-writing code and
the Python array-reading script in the cookbook, and maybe a page to help
people coming from Fortran to get started with Python.
program makeArray
  integer :: i,j
  do i = 1,nx
    do j = 1,ny
      ux(i,j) = real(i*j)
      uy(i,j) = real(i)/real(j)
      p(i,j)  = real(i) + real(j)
    end do
  end do
end program makeArray
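A numpy.fromfile version might look like the sketch below, assuming Fortran unformatted sequential records with 4-byte markers before and after each payload, written little-endian (the helper name and the reshape example are mine; adjust the dtype string, e.g. '>f4', for a big-endian file):

```python
import numpy as np

def read_record(f, dtype, count):
    """Read one Fortran unformatted sequential record (sketch).
    Assumes 4-byte record markers and little-endian data."""
    n = int(np.fromfile(f, dtype='<i4', count=1)[0])  # leading marker
    data = np.fromfile(f, dtype=dtype, count=count)
    np.fromfile(f, dtype='<i4', count=1)              # trailing marker
    assert n == data.nbytes                           # sanity check
    return data
```

Usage would then be something like `ux = read_record(f, '<f4', nx*ny).reshape((nx, ny), order='F')`, with order='F' matching Fortran's column-major layout.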
It seems it's possible using e.g.

In : dtype([('foo', str)])
Out: dtype([('foo', '|S0')])

to get yourself a zero-length string. However dtype('|S0') results in
a TypeError: data type not understood.
I understand the stupidity of creating a 0-length string field, but
it's conceivable that one is created accidentally.
For example, it could lead to a situation where you've created that
field, are missing all the data you had meant to put in it, serialize
with np.save, and upon np.load aren't able to get _any_ of your data
back because the dtype descriptor is considered bogus (can you guess
why I thought of this scenario?).
It seems that either dtype(str) should do something more sensible than
a zero-length string, or it should be possible to create it with
dtype('|S0'). Which should it be?
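Until that's resolved, a defensive check before serializing can catch the hazard described above. A minimal sketch; the helper name is mine, not a NumPy API:

```python
import numpy as np

def has_zero_itemsize_field(dt):
    # Hypothetical guard: report structured-dtype fields whose itemsize
    # is 0, since such descriptors may not round-trip through np.save.
    return any(dt.fields[name][0].itemsize == 0
               for name in (dt.names or ()))

# Explicit lengths are safe; a bare str type yields a zero-length field.
assert not has_zero_itemsize_field(np.dtype([('a', '<f8'), ('b', 'S4')]))
assert has_zero_itemsize_field(np.dtype([('foo', str)]))
```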
I was wondering if we could finally move to a more recent version of the
compilers for official win32 installers. This would of course concern
the next release cycle, not the ones where beta/rc are already in
progress.
Basically, the pros:
- we will have to move at some point
- gcc 4.* seem less buggy, especially C++ and fortran.
- no need to maintain msvcr90 voodoo
The cons:
- it will most likely break the ABI
- we need to recompile atlas (but I can take care of it)
- the biggest: it is difficult to combine gfortran with visual
studio (more exactly you cannot link gfortran runtime to a visual
studio executable). The only solution I could think of would be to
recompile the gfortran runtime with Visual Studio, which for some
reason does not sound very appealing :)
I am facing an issue upgrading NumPy from 1.5.1 to 1.6.1.
In NumPy 1.6, the casting behaviour for ufuncs has changed and has become
stricter. Can someone advise how to implement the simple example below, which
worked in 1.5.1 but fails in 1.6.1?
>>> import numpy as np
>>> def add(a,b):
... return (a+b)
>>> uadd = np.frompyfunc(add, 2, 1)
>>> uadd
<ufunc 'add (vectorized)'>
>>> uadd.accumulate(np.arange(10))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: could not find a matching type for add (vectorized).accumulate,
requested type has type code 'l'
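One workaround that seems to survive the stricter 1.6 type resolution is to feed accumulate an object-dtype array, which matches the only (object) loop a frompyfunc ufunc has. A sketch (this is my suggestion, not an official fix):

```python
import numpy as np

def add(a, b):
    return a + b

uadd = np.frompyfunc(add, 2, 1)

# With an integer input, accumulate requests a 'l' (long) loop, which
# a frompyfunc ufunc does not have. Casting the input to object dtype
# makes accumulate resolve to the ufunc's object loop instead.
a = np.arange(5).astype(object)
result = uadd.accumulate(a)
assert list(result) == [0, 1, 3, 6, 10]
```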
Announcing Numexpr 2.0
Numexpr is a fast numerical expression evaluator for NumPy. With it,
expressions that operate on arrays (like "3*a+4*b") are accelerated
and use less memory than doing the same calculation in Python.
It has multi-threaded capabilities, as well as support for Intel's
VML library, which allows for squeezing the last drop of performance
out of your multi-core processors.
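For readers new to the package, a minimal usage sketch (assuming numexpr 2.0 is installed; the array values are my own example data):

```python
import numpy as np
import numexpr as ne

# The expression string is compiled once and evaluated in chunks
# across threads, avoiding full-size temporaries for 3*a and 4*b.
a = np.arange(1_000_000, dtype=np.float64)
b = np.arange(1_000_000, dtype=np.float64)

result = ne.evaluate("3*a + 4*b")
assert np.allclose(result, 3*a + 4*b)
```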
This version comes with support for the new iterator in NumPy
(introduced in NumPy 1.6), allowing for improved performance in
practically all scenarios (the exception being very small arrays),
and especially for operations involving broadcasting,
Fortran ordering, or non-native byte orderings.
The carefully crafted mix of the new NumPy iterator and direct access
to data buffers turned out to be so powerful and flexible, that the
internal virtual machine has been completely revamped around this new approach.
The drawback is that you will need NumPy >= 1.6 to run numexpr 2.0.
However, NumPy 1.6 was released more than 6 months ago, so we
think this is a good time to take advantage of it. Many thanks to
Mark Wiebe for such an important contribution!
For some benchmarks on the new virtual machine, see:
Also, Gaëtan de Menten contributed important bug fixes and code cleanup,
as well as speed enhancements. Francesc Alted contributed some fixes,
and added compatibility code with existing applications (PyTables).
In case you want to know in more detail what has changed in this version, see:
or have a look at RELEASE_NOTES.txt in the tarball.
Where can I find Numexpr?
The project is hosted at Google code in:
You can get the packages from PyPI as well:
Share your experience
Let us know of any bugs, suggestions, gripes, kudos, etc. you may have.