[Numpy-discussion] Constant array of tuples

Frank Peacock frank at gis4weather.com
Fri Apr 10 18:15:10 EDT 2009


Hello

I would like to know whether there is a simple way to construct a constant
array of tuples:

How do I construct an array of size (width*height) where each element is
the tuple (w,x,y,z)?

Frank
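
One way to do this (a sketch; the concrete values for w, x, y, z and the
sizes are made up here) is either a trailing length-4 axis filled by
broadcasting, or a structured dtype whose elements really are 4-field
records:

```python
import numpy as np

# Hypothetical values for the (w, x, y, z) quadruple in the question.
w, x, y, z = 10, 20, 30, 40
width, height = 4, 3

# Option 1: a (height, width, 4) array; broadcasting fills every element.
a = np.zeros((height, width, 4), dtype=np.int32)
a[...] = (w, x, y, z)

# Option 2: a flat array of width*height records with named fields.
dt = np.dtype([('w', np.int32), ('x', np.int32),
               ('y', np.int32), ('z', np.int32)])
b = np.zeros(width * height, dtype=dt)
b[:] = (w, x, y, z)
```

Option 1 is the usual choice when the four values are homogeneous (e.g.
RGBA channels); option 2 keeps each quadruple addressable by name.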

-----Original Message-----
From: numpy-discussion-bounces at scipy.org
[mailto:numpy-discussion-bounces at scipy.org] On Behalf Of
numpy-discussion-request at scipy.org
Sent: 10 April 2009 15:19
To: numpy-discussion at scipy.org
Subject: Numpy-discussion Digest, Vol 31, Issue 26

Send Numpy-discussion mailing list submissions to
	numpy-discussion at scipy.org

To subscribe or unsubscribe via the World Wide Web, visit
	http://mail.scipy.org/mailman/listinfo/numpy-discussion
or, via email, send a message with subject or body 'help' to
	numpy-discussion-request at scipy.org

You can reach the person managing the list at
	numpy-discussion-owner at scipy.org

When replying, please edit your Subject line so it is more specific
than "Re: Contents of Numpy-discussion digest..."


Today's Topics:

   1. Re: frombuffer alignment for ctypes.Structure array
      (David Cournapeau)
   2. Re: DVCS at PyCon (Ondrej Certik)
   3. Re: DVCS at PyCon (David Cournapeau)
   4. Re: DVCS at PyCon (Matthieu Brucher)
   5. numpy import on x86_64 arch (Vincent Thierion)
   6. Re: numpy import on x86_64 arch (David Cournapeau)
   7. Replacing colours in numpy array (Frank Peacock)


----------------------------------------------------------------------

Message: 1
Date: Fri, 10 Apr 2009 17:28:23 +0900
From: David Cournapeau <david at ar.media.kyoto-u.ac.jp>
Subject: Re: [Numpy-discussion] frombuffer alignment for
	ctypes.Structure array
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID: <49DF0327.5010209 at ar.media.kyoto-u.ac.jp>
Content-Type: text/plain; charset=ISO-8859-1

Roland Schulz wrote:
> There is no align or aligned option to frombuffer. What is the best way
> to tell numpy to align the data as the C-struct/ctypes.Structure array is?

You could add a 'fake' field in between to get the right alignment, maybe?

import numpy as N
from ctypes import *

class C(Structure):
    _fields_=[("a",c_int),("b",c_short), ('', c_short)]

c=(C*2)()
_ctypes_to_numpy = {c_short : N.int16,c_int : N.int32}
ty = N.dtype([(x,_ctypes_to_numpy[y]) for x,y in C._fields_])
x=N.frombuffer(c,ty)

You may use a different type if you need another alignment, or use
something like 'VN' (N bytes of raw padding) in the dtype, e.g.:

a = N.dtype([('a', N.int32), ('b', N.int16), ('', 'V2')])
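
A quick sanity check that a padded dtype really matches the ctypes layout
(a sketch; the padding field is named "pad" here rather than left empty,
since empty field names may be rejected on either side):

```python
import numpy as N
from ctypes import Structure, c_int, c_short, sizeof

class C(Structure):
    _fields_ = [("a", c_int), ("b", c_short), ("pad", c_short)]

# Mirror the struct field-for-field, including the explicit padding.
ty = N.dtype([("a", N.int32), ("b", N.int16), ("pad", N.int16)])

c = (C * 2)()
c[1].a = 7
x = N.frombuffer(c, ty)

assert x.itemsize == sizeof(C)   # the two layouts agree in size
assert x["a"][1] == 7            # data round-trips through the buffer
```
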

David


------------------------------

Message: 2
Date: Fri, 10 Apr 2009 01:54:15 -0700
From: Ondrej Certik <ondrej at certik.cz>
Subject: Re: [Numpy-discussion] DVCS at PyCon
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID:
	<85b5c3130904100154l797d062fn90ea9389bc2da0b at mail.gmail.com>
Content-Type: text/plain; charset=UTF-8

On Fri, Apr 10, 2009 at 1:07 AM, David Cournapeau
<david at ar.media.kyoto-u.ac.jp> wrote:
> Eric Firing wrote:
>>
>> This is simply wrong. Mercurial uses hard links for cloning a repo that
>> is on the same disk, so it is faster and much more space-efficient than
>> copying the files.
>
> Yes, but maybe Ondrej is talking about an older hg version? Hg could not
> handle NTFS hardlinks for some time, but it does now if you have pywin32.
>
> And still, switching between branches is faster in git than in hg, for
> some technical reasons I can't really explain (but can be found on the
> ML archives). But as I said previously, speed is not really an argument
> for me. If hg is fast enough for python, it is obviously fast enough for
> numpy and scipy. As long as it does not take minutes to merge/review
> the 5 lines difference between two branches as is the case in svn right
> now, I am happier :)
>
>> But if you do want named branches in a given repo,
>> you can have that also with hg. Granted, it has not always been part of
>> hg, but it is now. Same with rebasing and transplanting.
>>
>
> As I understand, and correct me if I am wrong, the problems with named
> branches are:
>    - you can't remove them later, it is in the repository 'forever'
>    - it is not easy to make them publicly available

Plus with git, you can fetch the remote repository with all the
branches and browse them locally in your remote branches, when you are
offline. And merge them with your own branches. In mercurial, it seems
the only way to see what changes are there, on which branch, and which
commits I want to merge is to use "hg in", but that requires an
internet connection, so it's basically like svn. I don't know if
mercurial has improved on this lately, but at least for me, that's a
major problem with mercurial.

But since python now moved to mercurial too, maybe they will help fix this.
:)

It seems to me like the python distutils vs cmake discussion. I also
prefer the "unpythonic" cmake.

Ondrej


------------------------------

Message: 3
Date: Fri, 10 Apr 2009 17:44:23 +0900
From: David Cournapeau <david at ar.media.kyoto-u.ac.jp>
Subject: Re: [Numpy-discussion] DVCS at PyCon
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID: <49DF06E7.9040401 at ar.media.kyoto-u.ac.jp>
Content-Type: text/plain; charset=ISO-8859-1

Eric Firing wrote:
> Speaking to Josef: does tortoise-hg provide a satisfactory windows gui, 
> from your standpoint?
>   

Another solution may be eclipse integration. I don't know if that would
work for Josef, but there is a git plugin for eclipse, and I can at
least clone branches from a remote repository, and work with it.

Is there a hg eclipse plugin? I am not very knowledgeable about IDEs.

David


------------------------------

Message: 4
Date: Fri, 10 Apr 2009 11:13:59 +0200
From: Matthieu Brucher <matthieu.brucher at gmail.com>
Subject: Re: [Numpy-discussion] DVCS at PyCon
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID:
	<e76aa17f0904100213q48a7d4aaq8e476cf95bcb6f34 at mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1

2009/4/10 David Cournapeau <david at ar.media.kyoto-u.ac.jp>:
> Eric Firing wrote:
>> Speaking to Josef: does tortoise-hg provide a satisfactory windows gui,
>> from your standpoint?
>>
>
> Another solution may be eclipse integration. I don't know if that would
> work for Josef, but there is a git plugin for eclipse, and I can at
> least clone branches from a remote repository, and work with it.
>
> Is there a hg eclipse plugin? I am not very knowledgeable about IDEs.

Yes, there is MercurialEclipse. I don't know how it handles branches.
I use BzrEclipse for my work, and it doesn't handle branches at all,
you have to fall back to the command line.

Matthieu
-- 
Information System Engineer, Ph.D.
Website: http://matthieu-brucher.developpez.com/
Blogs: http://matt.eifelle.com and http://blog.developpez.com/?blog=92
LinkedIn: http://www.linkedin.com/in/matthieubrucher


------------------------------

Message: 5
Date: Fri, 10 Apr 2009 13:14:32 +0200
From: Vincent Thierion <vincent.thierion at ema.fr>
Subject: [Numpy-discussion] numpy import on x86_64 arch
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID:
	<fe7702cd0904100414g59e97bd4pebac3fad0a6e110f at mail.gmail.com>
Content-Type: text/plain; charset="iso-8859-1"

Hello,

I import the numpy module (numpy-1.0.4) on an x86_64 machine (on which I
don't have any root privileges) after installing it with "python setup.py
install --prefix=../numpy". This way, I obtain a 64-bit compatible
numpy library.
(the "numpy" folder used for the install is created just before the install
process)

I got the following error:

    import numpy
  File "../numpy/lib64/python2.3/site-packages/numpy/__init__.py", line 43,
in ?
  File "../numpy/lib64/python2.3/site-packages/numpy/linalg/__init__.py",
line 4, in ?
  File "../numpy/lib64/python2.3/site-packages/numpy/linalg/linalg.py", line
25, in ?
ImportError: liblapack.so.3: cannot open shared object file: No such file or
directory

I looked for liblapack.so.3 and found it in /usr/lib/liblapack.so.3. It
seems it doesn't work because the 64-bit numpy needs
/usr/lib64/liblapack.so.3. I tested this on another machine, and it works
when numpy can find /usr/lib64/liblapack.so.3.

So is there a solution to use the 64-bit numpy with
/usr/lib/liblapack.so.3, or is it absolutely necessary to install a 64-bit
version of lapack?

Thank you

Vincent

------------------------------

Message: 6
Date: Fri, 10 Apr 2009 20:02:13 +0900
From: David Cournapeau <david at ar.media.kyoto-u.ac.jp>
Subject: Re: [Numpy-discussion] numpy import on x86_64 arch
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID: <49DF2735.4080708 at ar.media.kyoto-u.ac.jp>
Content-Type: text/plain; charset=ISO-8859-1

Vincent Thierion wrote:
> Hello,
>
> I import numpy module (numpy-1.0.4) on a x86_64 machine (on which I
> don't have any root privileges) after having install it thanks to
> python "setup.py install --prefix=../numpy". In this manner, I obtain
> a 64 bits compatible numpy library.
> (the "numpy" folder used for install is created just before install
> process)

You should really use a newer version if you install from sources. numpy
1.0.4 is a year and a half old. If you have to use python 2.3, you
should use numpy 1.1.*. If you have python 2.4 or later, you should use
numpy 1.3.0.

>
> I had the following error :
>
>     import numpy
>   File "../numpy/lib64/python2.3/site-packages/numpy/__init__.py",
> line 43, in ?
>   File
> "../numpy/lib64/python2.3/site-packages/numpy/linalg/__init__.py",
> line 4, in ?
>   File
> "../numpy/lib64/python2.3/site-packages/numpy/linalg/linalg.py", line
> 25, in ?
> ImportError: liblapack.so.3: cannot open shared object file: No such
> file or directory

What does ldd
../numpy/lib64/python2.3/site-packages/numpy/linalg/lapack_lite.so say ?

> So is there a solution to use numpy 64 bits version with a
> /usr/*lib*/liblapack.so.3 or is it absolutly necessary to install
> lapack 64 bits version ?

It is certainly mandatory to have a 64-bit version of lapack when
building numpy for a 64-bit python, but it does not have to be in
/usr/lib64. You can also build numpy without lapack:

LAPACK=None python setup.py ....

cheers,

David


------------------------------

Message: 7
Date: Fri, 10 Apr 2009 15:18:52 +0100
From: "Frank Peacock" <frank at gis4weather.com>
Subject: [Numpy-discussion] Replacing colours in numpy array
To: <numpy-discussion at scipy.org>
Message-ID: <003001c9b9e7$4660c220$d3224660$@com>
Content-Type: text/plain;	charset="us-ascii"

Hello

I have a numpy array that I obtained from a converted RGBA gif image. I have
tried to replace some colours with different ones using the where condition
but have a problem with dimensions.

If I use b=where(a==0,255,a), where a is a numpy array from an image, it
does replace components of the RGB values in each pixel, but it fails with
an incorrect-dimension error if I try
b=where(a==(0,0,0,255),(255,255,255,255),a).

Could you please help?

Thanks

Frank
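
The dimension error comes from comparing a (height, width, 4) array
against a length-4 tuple: the comparison is per channel, not per pixel.
One way to make it work (a sketch, using a tiny made-up RGBA image) is
to reduce the comparison over the colour axis and broadcast the mask:

```python
import numpy as np

# A tiny hypothetical RGBA image: one black pixel, one red pixel.
a = np.array([[[0, 0, 0, 255], [255, 0, 0, 255]]], dtype=np.uint8)

# a == (0, 0, 0, 255) compares channel-by-channel, giving a (1, 2, 4)
# boolean array. Reduce over the colour axis so each pixel yields one
# boolean, then re-add the axis so the mask broadcasts per pixel.
mask = (a == np.array([0, 0, 0, 255])).all(axis=-1)
b = np.where(mask[..., np.newaxis],
             np.array([255, 255, 255, 255], dtype=np.uint8), a)
```

Here the black pixel is replaced with white while the red one is left
untouched, since only fully matching pixels satisfy the mask.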

-----Original Message-----
From: numpy-discussion-bounces at scipy.org
[mailto:numpy-discussion-bounces at scipy.org] On Behalf Of
numpy-discussion-request at scipy.org
Sent: 10 April 2009 08:18
To: numpy-discussion at scipy.org
Subject: Numpy-discussion Digest, Vol 31, Issue 24



Today's Topics:

   1. Re: DVCS at PyCon (Ondrej Certik)
   2. Re: using reducing functions without eliminating dimensions?
      (Dan Lenski)
   3. Re: Another Array (Ian Mallett)
   4. Re: Another Array (Robert Kern)
   5. Re: Another Array (Ian Mallett)
   6. Re: Another Array (Robert Kern)
   7. Re: Another Array (Ian Mallett)
   8. Re: Another Array (Anne Archibald)


----------------------------------------------------------------------

Message: 1
Date: Thu, 9 Apr 2009 23:14:53 -0700
From: Ondrej Certik <ondrej at certik.cz>
Subject: Re: [Numpy-discussion] DVCS at PyCon
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID:
	<85b5c3130904092314o7512cdbfjd2f43b98e65f48cf at mail.gmail.com>
Content-Type: text/plain; charset=UTF-8

On Thu, Apr 9, 2009 at 10:45 PM, David Cournapeau
<david at ar.media.kyoto-u.ac.jp> wrote:
> Ondrej Certik wrote:
>>
>> It is maybe easier to learn how to work with different clones, but
>> once you start working with lots of patches and you need to reclone
>> all the time, then it's the wrong approach to work, as it takes lots
>> of time to copy the whole repository on the disk.
>
> Yes, *I* know how to use git, and I agree with you, I vastly prefer git
> branch handling to bzr branch handling. *I* find working with GUI for
> VCS a real PITA. But I am not the only numpy developer, that's why the
> feedback from people like Josef with a totally different workflow than
> me is valuable - much more than people like us who are unix geeks :)

Yes, definitely.

Ondrej


------------------------------

Message: 2
Date: Fri, 10 Apr 2009 06:36:08 +0000 (UTC)
From: Dan Lenski <dlenski at gmail.com>
Subject: Re: [Numpy-discussion] using reducing functions without
	eliminating dimensions?
To: numpy-discussion at scipy.org
Message-ID: <grmpcn$2j4$1 at ger.gmane.org>
Content-Type: text/plain; charset=UTF-8

On Thu, 09 Apr 2009 01:31:33 -0500, Robert Kern wrote:

> On Thu, Apr 9, 2009 at 01:29, Anne Archibald <peridot.faceted at gmail.com>
> wrote:
> 
>> What's wrong with np.amin(a,axis=-1)[...,np.newaxis]?
> 
> It's cumbersome, particularly when you have axis=arbitrary_axis.

Quite right.  It would be nice to be able to say:

np.amin(a, axiskeep=-1)
or
a.min(axiskeep=3)

... or something along those lines.

Dan
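
Later NumPy versions (1.7 and up) added exactly this as a `keepdims`
keyword on the reduction functions, which works for an arbitrary axis:

```python
import numpy as np

a = np.arange(24).reshape(2, 3, 4)

# keepdims=True retains the reduced axis with length 1, so the result
# broadcasts against `a` regardless of which axis was reduced.
m = a.min(axis=-1, keepdims=True)
```

The result has shape (2, 3, 1) and is equivalent to the
`np.amin(a, axis=-1)[..., np.newaxis]` idiom from the thread.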



------------------------------

Message: 3
Date: Thu, 9 Apr 2009 23:42:03 -0700
From: Ian Mallett <geometrian at gmail.com>
Subject: Re: [Numpy-discussion] Another Array
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID:
	<a62fab400904092342y5509976bk927277d72730a095 at mail.gmail.com>
Content-Type: text/plain; charset="iso-8859-1"

It gives a perfect parabolic shape that looks very nice, but somewhat
unrealistic.  I'd like to scale the unit vectors by a random length (which
can just be a uniform distribution).  I tried scaling the unit vector n*n*3
array by a random n*n array, but that didn't work, obviously.  Help?

------------------------------

Message: 4
Date: Fri, 10 Apr 2009 01:46:47 -0500
From: Robert Kern <robert.kern at gmail.com>
Subject: Re: [Numpy-discussion] Another Array
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID:
	<3d375d730904092346l6850c03chca1962915263e79c at mail.gmail.com>
Content-Type: text/plain; charset=UTF-8

On Fri, Apr 10, 2009 at 01:42, Ian Mallett <geometrian at gmail.com> wrote:
> It gives a perfect parabolic shape that looks very nice, but somewhat
> unrealistic.

Parabolic? They should be spherical.

> I'd like to scale the unit vectors by a random length (which
> can just be a uniform distribution). I tried scaling the unit vector
> n*n*3 array by a random n*n array, but that didn't work, obviously.

No, it's not obvious. Exactly what code did you try? What results did
you get? What results were you expecting?

> Help?

Let's take a step back. What kind of distribution are you trying to
achieve? You asked for uniformly distributed unit vectors. Now you are
asking for something else, but I'm really not sure what. What standard
are you comparing against when you say that the unit vectors look
"unrealistic"?

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
  -- Umberto Eco


------------------------------

Message: 5
Date: Thu, 9 Apr 2009 23:58:21 -0700
From: Ian Mallett <geometrian at gmail.com>
Subject: Re: [Numpy-discussion] Another Array
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID:
	<a62fab400904092358m60233289u73aba89a6bdad4b2 at mail.gmail.com>
Content-Type: text/plain; charset="iso-8859-1"

On Thu, Apr 9, 2009 at 11:46 PM, Robert Kern <robert.kern at gmail.com> wrote:

> Parabolic? They should be spherical.

The particle system in the last screenshot was affected by gravity.  In the
absence of gravity, the results should be spherical, yes.  All the vectors
are a unit length, which produces a perfectly smooth surface (unrealistic
for such an effect).

> No, it's not obvious. Exactly what code did you try? What results did
> you get? What results were you expecting?

It crashed.
I have this code:
vecs = Numeric.random.standard_normal(size=(self.size[0],self.size[1],3))
magnitudes = Numeric.sqrt((vecs*vecs).sum(axis=-1))
uvecs = vecs / magnitudes[...,Numeric.newaxis]
randlen = Numeric.random.random((self.size[0],self.size[1]))
randuvecs = uvecs*randlen #It crashes here with a dimension mismatch
rgb = ((randuvecs+1.0)/2.0)*255.0

I also tried randlen = Numeric.random.random((self.size[0],self.size[1],3)),
but this does not scale each of the vector's components equally, producing
artifacts again.  Each needs to be scaled by the same random value for it to
make sense.

> Let's take a step back. What kind of distribution are you trying to
> achieve? You asked for uniformly distributed unit vectors. Now you are
> asking for something else, but I'm really not sure what. What standard
> are you comparing against when you say that the unit vectors look
> "unrealistic"?

The vectors are used to "jitter" each particle's initial speed, so that the
particles go in different directions instead of moving all as one.  Using
the unit vector causes the particles to make the smooth parabolic shape.
The jitter vectors must then be of a random length, so that the particles go
in all different directions at all different speeds, instead of just all in
different directions.

Ian

------------------------------

Message: 6
Date: Fri, 10 Apr 2009 02:01:10 -0500
From: Robert Kern <robert.kern at gmail.com>
Subject: Re: [Numpy-discussion] Another Array
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID:
	<3d375d730904100001w4cdb7bedi2b36bca97731ef8c at mail.gmail.com>
Content-Type: text/plain; charset=UTF-8

On Fri, Apr 10, 2009 at 01:58, Ian Mallett <geometrian at gmail.com> wrote:
> On Thu, Apr 9, 2009 at 11:46 PM, Robert Kern <robert.kern at gmail.com> wrote:
>>
>> Parabolic? They should be spherical.
>
> The particle system in the last screenshot was affected by gravity. In the
> absence of gravity, the results should be spherical, yes. All the vectors
> are a unit length, which produces a perfectly smooth surface (unrealistic
> for such an effect).
>>
>> No, it's not obvious. Exactly what code did you try? What results did
>> you get? What results were you expecting?
>
> It crashed.
> I have this code:
> vecs = Numeric.random.standard_normal(size=(self.size[0],self.size[1],3))
> magnitudes = Numeric.sqrt((vecs*vecs).sum(axis=-1))
> uvecs = vecs / magnitudes[...,Numeric.newaxis]
> randlen = Numeric.random.random((self.size[0],self.size[1]))
> randuvecs = uvecs*randlen #It crashes here with a dimension mismatch
> rgb = ((randvecs+1.0)/2.0)*255.0
>
> I also tried randlen = Numeric.random.random((self.size[0],self.size[1],3)),
> but this does not scale each of the vector's components equally, producing
> artifacts again. Each needs to be scaled by the same random value for it
> to make sense.

See how I did magnitudes[...,numpy.newaxis]? You have to do the same.

>> Let's take a step back. What kind of distribution are you trying to
>> achieve? You asked for uniformly distributed unit vectors. Now you are
>> asking for something else, but I'm really not sure what. What standard
>> are you comparing against when you say that the unit vectors look
>> "unrealistic"?
>
> The vectors are used to "jitter" each particle's initial speed, so that
> the particles go in different directions instead of moving all as one.
> Using the unit vector causes the particles to make the smooth parabolic
> shape. The jitter vectors must then be of a random length, so that the
> particles go in all different directions at all different speeds, instead
> of just all in different directions.

Ah, okay. That makes sense.

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
  -- Umberto Eco


------------------------------

Message: 7
Date: Fri, 10 Apr 2009 00:07:45 -0700
From: Ian Mallett <geometrian at gmail.com>
Subject: Re: [Numpy-discussion] Another Array
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID:
	<a62fab400904100007y185a80d5jfa4d62a850d1245 at mail.gmail.com>
Content-Type: text/plain; charset="iso-8859-1"

This seems to work:

vecs = Numeric.random.standard_normal(size=(self.size[0],self.size[1],3))
magnitudes = Numeric.sqrt((vecs*vecs).sum(axis=-1))
uvecs = vecs / magnitudes[...,Numeric.newaxis]
randlen = Numeric.random.random((self.size[0],self.size[1]))
randuvecs = uvecs*randlen[...,Numeric.newaxis]
rgb = ((randuvecs+1.0)/2.0)*255.0

(I have "import numpy as Numeric" for other reasons; that's why there's
Numeric there.)

Thanks,
Ian

------------------------------

Message: 8
Date: Fri, 10 Apr 2009 03:17:58 -0400
From: Anne Archibald <peridot.faceted at gmail.com>
Subject: Re: [Numpy-discussion] Another Array
To: Discussion of Numerical Python <numpy-discussion at scipy.org>
Message-ID:
	<ce557a360904100017l45019b89t32ce233226c4fae0 at mail.gmail.com>
Content-Type: text/plain; charset=UTF-8

2009/4/10 Ian Mallett <geometrian at gmail.com>:

> The vectors are used to "jitter" each particle's initial speed, so that
> the particles go in different directions instead of moving all as one.
> Using the unit vector causes the particles to make the smooth parabolic
> shape. The jitter vectors must then be of a random length, so that the
> particles go in all different directions at all different speeds, instead
> of just all in different directions.

Why not just skip the normalization? Then you'll get vectors with
random direction and a natural distribution of lengths. And it'll be
faster than the unit vectors...

Anne
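
That suggestion can be sketched in a few lines (the n*n grid size is the
hypothetical one from earlier in the thread): standard-normal components
already give isotropic directions, and the unnormalized lengths follow a
chi distribution with 3 degrees of freedom for free.

```python
import numpy as np

n = 8
# Isotropic random directions with naturally varying lengths: just draw
# normal components and skip the normalise-then-rescale step entirely.
vecs = np.random.standard_normal(size=(n, n, 3))
lengths = np.sqrt((vecs * vecs).sum(axis=-1))
```
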


------------------------------

_______________________________________________
Numpy-discussion mailing list
Numpy-discussion at scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


End of Numpy-discussion Digest, Vol 31, Issue 24
************************************************




------------------------------

_______________________________________________
Numpy-discussion mailing list
Numpy-discussion at scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


End of Numpy-discussion Digest, Vol 31, Issue 26
************************************************




