Hi all,

I need to take the determinants of a large number of 3x3 matrices, in order to determine for each of N points in which of M tetrahedral cells they lie. I arrange the matrices in an ndarray of shape (N,M,5,3,3).

As far as I can tell, NumPy doesn't have a function to do determinants over a specific pair of axes, so I can't say:

    dets = np.linalg.det(a, axis1=-2, axis2=-1)

Using an explicit Python loop is very slow: doing the determinants of 100,000 random 3x3 matrices in a loop, and inserting the results into a preallocated results array, takes about 5.1 seconds. So instead I coded up my own routine to do the determinants over the last two axes, based on the naive textbook formula for a 3x3 determinant:

    def det3(ar):
        a=ar[...,0,0]; b=ar[...,0,1]; c=ar[...,0,2]
        d=ar[...,1,0]; e=ar[...,1,1]; f=ar[...,1,2]
        g=ar[...,2,0]; h=ar[...,2,1]; i=ar[...,2,2]
        return (a*e*i+b*f*g+c*d*h)-(g*e*c+h*f*a+i*d*b)

This is *much much* faster: it does the same 100,000 determinants in 0.07 seconds, and the results differ from linalg.det's by less than 1 part in 10^15 on average (with a standard deviation of about 1 part in 10^13).

Does anyone know of any well-written routines to do lots and lots of determinants of a fixed size? My routine has a couple of problems I can see:

* it's fast enough for 100,000 determinants, but it bogs down due to all the temporary arrays when I try to do 1,000,000 determinants (a 72 MB array)
* I've heard the numerical stability of the naive determinant algorithm is fairly poor, so I'm reluctant to use this function on real data.

Any advice will be appreciated!

Dan
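[For reference, the routine from this post can be exercised end-to-end as below. This is a minimal sketch, not from the original message: the array size and random test data are arbitrary, and only a slice is checked against the slow per-matrix loop to keep the comparison quick.]

```python
import numpy as np

def det3(ar):
    """Determinants over the last two axes of a (..., 3, 3) array,
    using the naive textbook (cofactor) expansion from the post."""
    a = ar[..., 0, 0]; b = ar[..., 0, 1]; c = ar[..., 0, 2]
    d = ar[..., 1, 0]; e = ar[..., 1, 1]; f = ar[..., 1, 2]
    g = ar[..., 2, 0]; h = ar[..., 2, 1]; i = ar[..., 2, 2]
    return (a*e*i + b*f*g + c*d*h) - (g*e*c + h*f*a + i*d*b)

m = np.random.rand(100_000, 3, 3)
vec = det3(m)                     # all 100,000 determinants at once

# Slow per-matrix reference, on a slice only, to confirm agreement
ref = np.array([np.linalg.det(x) for x in m[:1000]])
print(np.abs(vec[:1000] - ref).max())  # tiny, on the order of 1e-16
```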
On Mon, 25 Aug 2008 03:48:54 +0000, Daniel Lenski wrote:
> * it's fast enough for 100,000 determinants, but it bogs down due to all the temporary arrays when I try to do 1,000,000 determinants (a 72 MB array)
I've managed to reduce the memory usage significantly by getting the number of temporary arrays down to exactly two:

    def det3(ar):
        a=ar[...,0,0]; b=ar[...,0,1]; c=ar[...,0,2]
        d=ar[...,1,0]; e=ar[...,1,1]; f=ar[...,1,2]
        g=ar[...,2,0]; h=ar[...,2,1]; i=ar[...,2,2]

        t=a.copy(); t*=e; t*=i; tot =t
        t=b.copy(); t*=f; t*=g; tot+=t
        t=c.copy(); t*=d; t*=h; tot+=t
        t=g.copy(); t*=e; t*=c; tot-=t
        t=h.copy(); t*=f; t*=a; tot-=t
        t=i.copy(); t*=d; t*=b; tot-=t
        return tot

Now it runs very fast with 1,000,000 determinants to do (less than 10x the time required for 100,000), but I'm still worried about the stability with real-world data.

Dan
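[A quick equivalence check for the in-place variant, as a sketch; the function name `det3_inplace` and batch size are illustrative, not from the thread. The algorithm is exactly the one in the post: one accumulator array `tot` plus one scratch array `t` reused for each of the six terms.]

```python
import numpy as np

def det3_inplace(ar):
    # Same algorithm as in the post: only two temporaries (tot and t)
    a = ar[..., 0, 0]; b = ar[..., 0, 1]; c = ar[..., 0, 2]
    d = ar[..., 1, 0]; e = ar[..., 1, 1]; f = ar[..., 1, 2]
    g = ar[..., 2, 0]; h = ar[..., 2, 1]; i = ar[..., 2, 2]
    t = a.copy(); t *= e; t *= i; tot = t
    t = b.copy(); t *= f; t *= g; tot += t
    t = c.copy(); t *= d; t *= h; tot += t
    t = g.copy(); t *= e; t *= c; tot -= t
    t = h.copy(); t *= f; t *= a; tot -= t
    t = i.copy(); t *= d; t *= b; tot -= t
    return tot

m = np.random.rand(10_000, 3, 3)
ref = np.array([np.linalg.det(x) for x in m])  # slow per-matrix reference
assert np.allclose(det3_inplace(m), ref)
```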
2008/8/25 Daniel Lenski <dlenski@gmail.com>:
> On Mon, 25 Aug 2008 03:48:54 +0000, Daniel Lenski wrote:
>> * it's fast enough for 100,000 determinants, but it bogs down due to all the temporary arrays when I try to do 1,000,000 determinants (a 72 MB array)
>
> I've managed to reduce the memory usage significantly by getting the number of temporary arrays down to exactly two:
>
>     def det3(ar):
>         a=ar[...,0,0]; b=ar[...,0,1]; c=ar[...,0,2]
>         d=ar[...,1,0]; e=ar[...,1,1]; f=ar[...,1,2]
>         g=ar[...,2,0]; h=ar[...,2,1]; i=ar[...,2,2]
>
>         t=a.copy(); t*=e; t*=i; tot =t
>         t=b.copy(); t*=f; t*=g; tot+=t
>         t=c.copy(); t*=d; t*=h; tot+=t
>         t=g.copy(); t*=e; t*=c; tot-=t
>         t=h.copy(); t*=f; t*=a; tot-=t
>         t=i.copy(); t*=d; t*=b; tot-=t
>         return tot
>
> Now it runs very fast with 1,000,000 determinants to do (less than 10x the time required for 100,000), but I'm still worried about the stability with real-world data.
As far as I know, that's about the best you can do (though come to think of it, a 3x3 determinant is a scalar triple product, i.e. a cross product followed by a dot product, so you might be able to cheat and use some built-in routines).

It's a current limitation of numpy that there is not much support for solving many small linear algebra problems at once. We do have perennial discussions about improving support for array-of-matrices calculations, and in fact currently in numpy SVN there is code to allow C code computing a 3x3 determinant to be easily made into a "generalized ufunc" that does ufunc-like broadcasting and iteration over all but the last two axes.

Anne
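[Anne's cross-then-dot idea can be sketched as follows; this is an illustrative sketch, not code from the thread. Note also that the generalized-ufunc work she mentions did land: in NumPy 1.8 and later, np.linalg.det itself broadcasts over the leading axes of a stacked (..., 3, 3) array.]

```python
import numpy as np

a = np.random.rand(100_000, 3, 3)

# det(M) is the scalar triple product of M's rows: row0 . (row1 x row2)
dets = np.einsum('...i,...i->...',
                 a[..., 0, :],
                 np.cross(a[..., 1, :], a[..., 2, :]))

# In NumPy >= 1.8, np.linalg.det broadcasts over the leading axis itself
assert np.allclose(dets, np.linalg.det(a))
```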
On Sun, Aug 24, 2008 at 9:48 PM, Daniel Lenski <dlenski@gmail.com> wrote:
> Hi all, I need to take the determinants of a large number of 3x3 matrices, in order to determine for each of N points in which of M tetrahedral cells they lie. I arrange the matrices in an ndarray of shape (N,M,5,3,3).
If you step back a bit and describe the actual problem you have, rather than your current solution, it might be that there are algorithms in scipy to solve it.

Chuck
On Sun, 24 Aug 2008 23:49:45 -0600, Charles R Harris wrote:
> On Sun, Aug 24, 2008 at 9:48 PM, Daniel Lenski <dlenski@gmail.com> wrote:
>> Hi all, I need to take the determinants of a large number of 3x3 matrices, in order to determine for each of N points in which of M tetrahedral cells they lie. I arrange the matrices in an ndarray of shape (N,M,5,3,3).
>
> If you step back a bit and describe the actual problem you have, rather than your current solution, it might be that there are algorithms in scipy to solve it.
Hi Charles,

I have an irregular/unstructured mesh of tetrahedral cells, and I need to interpolate some data over this mesh at arbitrary points within the complete volume. So this is basically 3D interpolation.

I've done some looking into this already, and it seems that the only facility for 3D interpolation in SciPy is map_coordinates, which only works on a regular grid. So I have to roll my own interpolation!

I start by trying to figure out in which of the tetrahedral cells each interpolation point lies. The standard way to do this is to check, for each face (v1,v2,v3) of a tetrahedron, that the point (x,y,z) lies on the same side of that face as the fourth vertex v4. This holds when these two determinants have the same sign:

    | v1x-x  v1y-y  v1z-z |        | v1x-v4x  v1y-v4y  v1z-v4z |
    | v2x-x  v2y-y  v2z-z |  and   | v2x-v4x  v2y-v4y  v2z-v4z |
    | v3x-x  v3y-y  v3z-z |        | v3x-v4x  v3y-v4y  v3z-v4z |

(Here's a description of a nearly identical algorithm: http://steve.hollasch.net/cgindex/geometry/ptintet.html)

Dan
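[The determinant test described above amounts to one scalar triple product per face. A minimal, unvectorized sketch, not from the thread; the function name `in_tetrahedron` is illustrative, and degenerate cells and points exactly on a face are not handled specially.]

```python
import numpy as np

def in_tetrahedron(p, v):
    """Test whether point p lies inside the tetrahedron with vertices
    v[0..3] (shape (4, 3)): p is inside iff, for every face, p is on
    the same side of the face's plane as the opposite vertex."""
    v = np.asarray(v, float)
    p = np.asarray(p, float)
    for k in range(4):
        face = np.delete(v, k, axis=0)        # the 3 vertices of face k
        n = np.cross(face[1] - face[0], face[2] - face[0])  # face normal
        side_p = np.dot(n, p - face[0])       # signed side of the point
        side_v = np.dot(n, v[k] - face[0])    # signed side of opposite vertex
        if side_p * side_v < 0:               # opposite signs: p is outside
            return False
    return True

tet = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
print(in_tetrahedron([0.1, 0.1, 0.1], tet))  # True
print(in_tetrahedron([1.0, 1.0, 1.0], tet))  # False
```

For N points against M cells this loop would of course be vectorized over the leading axes, exactly as det3 does.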
participants (4)

- Anne Archibald
- Charles R Harris
- Dan Lenski
- Daniel Lenski