memory management
Sudheer Joseph
sjo.india at gmail.com
Mon Feb 18 20:13:32 EST 2013
> Python version and OS please. And is the Python 32bit or 64bit? How
> much RAM does the computer have, and how big are the swapfiles ?
Python 2.7.3
Ubuntu 12.04, 64-bit
4 GB RAM
> "Fairly big" is fairly vague. To some people, a list with 100k members
> is huge, but not to a modern computer.
I have data loaded into memory from a netCDF file with 2091*140*180 grid points (2091 time, 140 latitude, 180 longitude). Apart from this, I define two 3-D arrays, r3d and lags3d, to store the output for writing to a netCDF file after completion.
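For scale, here is a quick back-of-envelope estimate of what those arrays occupy, assuming float64 data (the usual dtype netCDF4 returns for double variables) and assuming r3d and lags3d share the input grid's shape:

```python
import numpy as np

nt, nlat, nlon = 2091, 140, 180  # time, latitude, longitude

# Bytes for one float64 array of the full grid shape
data_bytes = nt * nlat * nlon * np.dtype(np.float64).itemsize
per_array_mb = data_bytes / 1024.0 ** 2

# Input data plus the two output arrays (r3d and lags3d), before any
# temporaries that intermediate numpy expressions may allocate
total_mb = 3 * per_array_mb

print("one array:    %.0f MB" % per_array_mb)   # roughly 400 MB each
print("three arrays: %.0f MB" % total_mb)       # over 1 GB before temporaries
```

On a 4 GB machine that leaves little headroom once numpy's intermediate copies are counted, which is consistent with the ~3 GB resident size seen in top below.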
> How have you checked whether it's running out of memory? Have you run
> 'top' on it? Or is that just a guess?
I have not done this, but the speed (assessed from the printed grid indices i and j) stalls after j=6, i.e. after running 6 longitude grids.
Will check top as you suggested.
Here is the result of top; it used about 3 GB of memory:
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
3069 sjo 20 0 3636m 3.0g 2504 D 3 78.7 3:07.44 python
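Besides watching top from outside, the script itself can report its peak resident memory as it runs. A minimal sketch using the standard-library resource module (on Linux, ru_maxrss is reported in kilobytes):

```python
import resource

def peak_rss_mb():
    """Return this process's peak resident set size in MB (Linux)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0

# Print this inside the i/j loop to see where memory climbs
print("peak RSS so far: %.1f MB" % peak_rss_mb())
```

Printing this once per longitude grid would show whether memory grows steadily (output arrays accumulating) or jumps at one step (a large temporary).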
> I haven't used numpy, scipy, nor matplotlib, and it's been a long time
> since I did correlations. But are you sure you're not just implementing
> an O(n**3) algorithm or something, and it's just extremely slow?
Correlation does not normally involve such computation; I am not sure if numpy does something like that internally.
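One common cause of this kind of slowdown is correlating grid points one at a time in a Python loop. As an illustration only (the array names and the small stand-in grid below are made up, not taken from the original script), the whole grid can be correlated against a reference time series in one vectorized pass:

```python
import numpy as np

nt, nlat, nlon = 100, 14, 18            # small stand-in for 2091x140x180
rng = np.random.RandomState(0)
data = rng.randn(nt, nlat, nlon)        # field: (time, lat, lon)
ref = rng.randn(nt)                     # reference time series

# Pearson correlation along the time axis, all grid points at once:
# subtract the time means, then form the covariance / variance ratio.
da = data - data.mean(axis=0)
ra = ref - ref.mean()
num = (da * ra[:, None, None]).sum(axis=0)
den = np.sqrt((da ** 2).sum(axis=0) * (ra ** 2).sum())
r = num / den                           # shape (nlat, nlon), no Python loop

print(r.shape)  # (14, 18)
```

This replaces nlat*nlon separate correlation calls with a handful of whole-array operations, which is usually both faster and easier on memory than building lists inside nested loops.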
with best regards,
Sudheer
> > from mpl_toolkits.basemap import Basemap as bm, shiftgrid, cm
> > import numpy as np
> > import matplotlib.pyplot as plt
> > from netCDF4 import Dataset
> > from math import pow, sqrt
> > import sys
> > from scipy.stats import t
> <snip>
>
> --
> DaveA
More information about the Python-list mailing list