Re: [Numpy-discussion] Is there a pure numpy recipe for this?

Thanks for all of the suggestions; we are migrating to 64-bit Python soon as well. The environments are Win7 and Mac Mavericks. carray sounds like what you said, Chris - more I just found at http://kmike.ru/python-data-structures/ - Ray Schumacher At 12:31 PM 3/27/2014, you wrote:
On Thu, Mar 27, 2014 at 7:42 AM, RayS <rays@blue-cove.com> wrote: I find this interesting, since I work with medical data sets of 100s of MB, and regularly run into memory allocation problems when doing a lot of Fourier analysis, waterfalls, etc. The per-process limit seems to be about 1.3GB on this 6GB quad-i7 with Win7.
This sounds like 32 bit -- have you tried a 64 bit Python/numpy? Not that you won't have issues anyway, but you should be able to do better than 1.3GB...
> memmaps are also limited to RAM,
I don't think so, no -- but they are limited to 2GB (I think) if you're using a 32 bit process.
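To illustrate the memmap point: numpy's np.memmap backs an array with a file on disk, so the data is paged in on demand and is bounded by disk space and the process's address space rather than by physical RAM (which is why a 32-bit process still caps out around 2GB). A minimal sketch, with an illustrative temp-file path and size:

```python
import os
import tempfile
import numpy as np

# Illustrative path and size -- scale n up well past RAM in practice.
path = os.path.join(tempfile.gettempdir(), "memmap_demo.dat")
n = 1_000_000  # float64 samples (~8 MB here, just for the demo)

# Create a file-backed array and write into it.
a = np.memmap(path, dtype="float64", mode="w+", shape=(n,))
a[:] = np.arange(n, dtype="float64")
a.flush()  # push pending changes out to the file

# Re-open read-only; slices and reductions page data in from disk.
b = np.memmap(path, dtype="float64", mode="r", shape=(n,))
print(b[:5])           # [0. 1. 2. 3. 4.]
print(float(b.sum()))  # 499999500000.0
```

Slicing and ufuncs work as on a normal ndarray; only the pages actually touched need to be resident.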
There is also a compressed array package out there -- I can't remember what it's called -- but if you have large compressible arrays -- that might help. -CHB
--
Christopher Barker, Ph.D. Oceanographer
Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception
Chris.Barker@noaa.gov _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org http://mail.scipy.org/mailman/listinfo/numpy-discussion

I'd recommend taking a look at PyTables as well. It has support for out-of-core array computations on large arrays. On Thu, Mar 27, 2014 at 9:00 PM, RayS <rays@blue-cove.com> wrote:
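A minimal sketch of the PyTables suggestion, assuming PyTables (with the blosc compressor) is installed; the file path and chunk sizes are illustrative. An EArray is extendable along one axis, so a large data set can be built and compressed chunk by chunk without ever holding the whole array in memory:

```python
import os
import tempfile
import numpy as np
import tables  # PyTables

path = os.path.join(tempfile.gettempdir(), "oocore_demo.h5")

# Write: an extendable, compressed 1-D array, appended one chunk at a time.
with tables.open_file(path, mode="w") as f:
    atom = tables.Float64Atom()
    filters = tables.Filters(complevel=5, complib="blosc")  # on-disk compression
    earr = f.create_earray(f.root, "data", atom, shape=(0,), filters=filters)
    for _ in range(10):
        earr.append(np.random.rand(100_000))  # only one chunk in RAM at a time

# Read back: slice without loading the whole array.
with tables.open_file(path, mode="r") as f:
    chunk = f.root.data[:1000]  # just the first 1000 samples
    total = len(f.root.data)
```

Here `total` is 1,000,000 samples on disk; reads pull in only the requested slice, which is the out-of-core behavior being recommended.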
participants (2)
-
Eelco Hoogendoorn
-
RayS