memory management
Dave Angel
davea at davea.name
Mon Feb 18 19:10:55 EST 2013
On 02/18/2013 10:29 AM, Sudheer Joseph wrote:
> HI,
> I have been trying to compute the cross-correlation between a time series at a single location, f(1), and the time series of spatial data, f(X,Y,T), saving the resulting correlation coefficients and lags in a three-dimensional array of fairly big size. The code I wrote for this works for a few iterations, then hangs, apparently due to a memory crunch. Can anybody suggest a better way to handle this situation so that the computation and data storage can be done without hang-ups? Finally, I intend to save the data as a netCDF file, which is not implemented as of now. Below is the piece of code I wrote for this purpose.
>
Python version and OS please. And is the Python 32-bit or 64-bit? How
much RAM does the computer have, and how big are the swapfiles?
"Fairly big" is fairly vague. To some people a list with 100k members
is huge, but it isn't to a modern computer.
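One quick way to make "fairly big" concrete is to ask numpy itself how much memory the result array takes. A sketch with made-up dimensions (lags x lat x lon; substitute your own):

```python
import numpy as np

# Hypothetical output array: 81 lags on a 180 x 360 grid, float64.
corr = np.zeros((81, 180, 360))

# nbytes gives the raw buffer size; ~40 MiB here, modest for modern RAM.
print(corr.nbytes / 1024**2, "MiB")
```

If that number is a large fraction of your physical RAM (or of the ~2-3 GB a 32-bit process can address), that alone explains the hang.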
How have you checked whether it's running out of memory? Have you run
'top' on it? Or is that just a guess?
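Besides watching 'top', you can check from inside the script itself. A minimal sketch using the stdlib resource module (Unix only; units differ by platform):

```python
import resource

# Peak resident set size of this process so far.
peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print("peak RSS:", peak)   # KiB on Linux, bytes on macOS
```

Printing that inside the iteration loop would show whether memory really grows without bound, or whether the loop is simply slow.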
I haven't used numpy, scipy, or matplotlib, and it's been a long time
since I did correlations. But are you sure you're not just implementing
an O(n**3) algorithm or something, and it's just extremely slow?
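For instance (a sketch, not the OP's code): if there is an explicit Python loop over lags inside the loops over grid points, replacing it with a single vectorized call removes one whole factor of n. numpy can compute all lags of a 1-D cross-correlation at once:

```python
import numpy as np

a = np.random.randn(1000)   # stand-in time series at the fixed point
b = np.random.randn(1000)   # stand-in time series at one grid point

# All 2*n-1 lags in one C-level call, instead of a Python loop per lag.
xc = np.correlate(a - a.mean(), b - b.mean(), mode='full')
print(xc.shape)             # (1999,)
```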
> from mpl_toolkits.basemap import Basemap as bm, shiftgrid, cm
> import numpy as np
> import matplotlib.pyplot as plt
> from netCDF4 import Dataset
> from math import pow, sqrt
> import sys
> from scipy.stats import t
<snip>
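If the full 3-D result genuinely cannot fit in RAM, one workaround (a sketch with made-up sizes and names, not a drop-in fix) is to back the output array with a disk file via np.memmap and fill it one grid point at a time; only the slices being written need to be resident, and the file can be converted to netCDF afterwards:

```python
import numpy as np
import os, tempfile

nlags, nlat, nlon, nt = 81, 18, 36, 200     # illustrative dimensions
path = os.path.join(tempfile.mkdtemp(), 'corr.dat')

# Disk-backed output array; the OS pages it in and out as needed.
out = np.memmap(path, dtype='float64', mode='w+',
                shape=(nlags, nlat, nlon))

series = np.random.randn(nt)                # stand-in for f(1)
field = np.random.randn(nt, nlat, nlon)     # stand-in for f(X,Y,T)

for j in range(nlat):
    for i in range(nlon):
        xc = np.correlate(series, field[:, j, i], mode='full')
        mid = len(xc) // 2
        out[:, j, i] = xc[mid - 40: mid + 41]   # keep lags -40..+40
    out.flush()                             # push each row to disk

print(out.shape)                            # (81, 18, 36)
```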
--
DaveA