Process "Killed"
Glenn Hutchings
zondo42 at googlemail.com
Fri Aug 29 15:12:09 EDT 2008
dieter <vel.accel at gmail.com> writes:
> I'm doing some simple file manipulation work and the process gets
> "Killed" every time I run it. No traceback, no segfault... just the
> word "Killed" in the bash shell and the process ends. The first few
> batch runs would only succeed with one or two files being processed
> (out of 60) before the process was "Killed". Now it makes no
> successful progress at all. Just a little processing, then "Killed".
>
> Any ideas? Is there a buffer limitation? Do you think it could be the
> filesystem?
> Any suggestions appreciated.... Thanks.
>
> The code I'm running:
> ==================
>
> from glob import glob
>
> def manipFiles():
>     filePathList = glob('/data/ascii/*.dat')
>     for filePath in filePathList:
>         f = open(filePath, 'r')
>         lines = f.readlines()[2:]
>         f.close()
>         f = open(filePath, 'w')
>         f.writelines(lines)
>         f.close()
>         print file
Have you checked memory usage while your program is running?  Your

    lines = f.readlines()[2:]

statement will need almost twice the memory of your largest file.  This
might be a problem, depending on your RAM and what else is running at the
same time.
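If you want to watch memory from inside the script itself, one quick option (a sketch, Unix-only, using the standard resource module; the helper name peak_rss_kb is just an illustration) is:

```python
import resource

def peak_rss_kb():
    # ru_maxrss is the process's peak resident set size so far.
    # Linux reports it in kilobytes; macOS reports it in bytes.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

print('peak RSS so far:', peak_rss_kb())
```

Printing that before and after the readlines() call would show how much the big list actually costs.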
If you want to reduce memory usage to almost zero, try reading lines from
the file and writing all but the first two to a temporary file, then
renaming the temp file to the original:
    import os

    infile = open(filePath, 'r')
    outfile = open(filePath + '.bak', 'w')

    for num, line in enumerate(infile):
        if num >= 2:
            outfile.write(line)

    infile.close()
    outfile.close()
    os.rename(filePath + '.bak', filePath)
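Folded back into your original function, the whole thing might look like this (just a sketch; the helper name strip_first_two_lines and the '.bak' suffix are my choices, not anything standard):

```python
import os
from glob import glob

def strip_first_two_lines(path):
    # Stream the file line by line so memory use stays tiny,
    # writing everything after the first two lines to a temp file.
    tmp = path + '.bak'
    infile = open(path, 'r')
    outfile = open(tmp, 'w')
    for num, line in enumerate(infile):
        if num >= 2:
            outfile.write(line)
    infile.close()
    outfile.close()
    # On the same filesystem this rename replaces the original in one step.
    os.rename(tmp, path)

def manipFiles():
    for path in glob('/data/ascii/*.dat'):
        strip_first_two_lines(path)
        print(path)
```

Note that if the process dies mid-run, the worst case is a leftover '.bak' file; the original is only replaced once the temp file is complete.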
Glenn