[Numpy-discussion] reduce array by computing min/max every n samples
Brad Buran
bburan at cns.nyu.edu
Mon Jun 21 15:09:26 EDT 2010
This would certainly be useful in my case as well. I originally tried doing
something similar:
fun = lambda x: (x.min(), x.max())
apply_along_axis(fun, -1, val_pts)
It turned out to be much slower, which I guess isn't too surprising.
Brad
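[For reference, a runnable version of the apply_along_axis approach above; the data shape here is illustrative. It is correct but slow because the lambda is invoked once per row from Python:]

```python
import numpy as np

n = 100
val_pts = np.random.rand(10, n)  # illustrative data; shape is an assumption

# Per-row (min, max) via apply_along_axis -- the lambda runs once per
# row in Python, which is why this is much slower than vectorized calls.
fun = lambda x: (x.min(), x.max())
result = np.apply_along_axis(fun, -1, val_pts)  # shape (10, 2)
```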
On Sat, Jun 19, 2010 at 4:45 PM, Warren Weckesser <
warren.weckesser at enthought.com> wrote:
> Benjamin Root wrote:
> > Brad, I think you are doing it the right way, but I think what is
> > happening is that the reshape() call on the sliced array is forcing a
> > copy to be made first. The fact that the copy has to be made twice
> > just worsens the issue. I would save the reshape result (it is
> > usually a view of the original data, unless a copy is forced), and
> > then call min() and max() on that with the appropriate axis.
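[Benjamin's suggestion might look like the following sketch; the array name and chunk size are illustrative:]

```python
import numpy as np

n = 100
data = np.arange(1000, dtype=float)  # illustrative stand-in for the streamed data

offset = data.size % n
chunks = data[offset:].reshape(-1, n)  # computed once; usually a view, no copy
data_min = chunks.min(axis=-1)         # no second reshape (and no second copy)
data_max = chunks.max(axis=-1)
```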
> >
> > On that note, would it be a bad idea to have a function that returns a
> > min/max tuple?
>
> +1. More than once I've wanted exactly such a function.
>
> Warren
>
>
> > Performing two iterations to gather the min and the max information
> > versus a single iteration to gather both at the same time would be
> > useful. I should note that there is a numpy.ptp() function that
> > returns the difference between the min and the max, but I don't see
> > anything that returns the actual values.
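[NumPy indeed has no built-in that returns both values. A pure-Python one-pass sketch, for illustration only -- in practice two C-level calls to min() and max() will usually still beat any Python-level loop:]

```python
import numpy as np

def minmax(a):
    """Return (min, max) of `a` in a single pass.

    Illustrative only: the per-element Python loop is far slower than
    numpy's C-level a.min() and a.max(), even though those make two passes.
    """
    a = np.asarray(a).ravel()
    lo = hi = a[0]
    for x in a[1:]:
        if x < lo:
            lo = x
        elif x > hi:
            hi = x
    return lo, hi
```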
> >
> > Ben Root
> >
> > On Thu, Jun 17, 2010 at 4:50 PM, Brad Buran <bburan at cns.nyu.edu
> > <mailto:bburan at cns.nyu.edu>> wrote:
> >
> > I have a 1D array with >100k samples that I would like to reduce by
> > computing the min/max of each "chunk" of n samples. Right now, my
> > code is as follows:
> >
> > n = 100
> > offset = array.size % n
> > array_min = array[offset:].reshape((-1, n)).min(-1)
> > array_max = array[offset:].reshape((-1, n)).max(-1)
> >
> > However, this appears to be running pretty slowly. The array is data
> > streamed in real-time from external hardware devices and I need to
> > downsample this and compute the min/max for plotting. I'd like to
> > speed this up so that I can plot updates to the data as quickly as
> > new data comes in.
> >
> > Are there recommendations for faster ways to perform the downsampling?
> >
> > Thanks,
> > Brad
> > _______________________________________________
> > NumPy-Discussion mailing list
> > NumPy-Discussion at scipy.org <mailto:NumPy-Discussion at scipy.org>
> > http://mail.scipy.org/mailman/listinfo/numpy-discussion
> >
>