Process "Killed"

dieter vel.accel at gmail.com
Thu Aug 28 01:30:31 EDT 2008


Hi,

Overview
========

I'm doing some simple file manipulation work and the process gets
"Killed" every time I run it. No traceback, no segfault... just the
word "Killed" in the bash shell, and the process ends. The first few
batch runs would only process one or two files (out of 60) before the
process was "Killed". Now it makes no successful progress at all --
just a little processing, then "Killed".


Question
========

Any ideas? Is there a buffer limitation? Could it be the filesystem?
Any suggestions appreciated. Thanks.


The code I'm running:
=====================

from glob import glob

def manipFiles():
    filePathList = glob('/data/ascii/*.dat')
    for filePath in filePathList:
        f = open(filePath, 'r')
        lines = f.readlines()[2:]  # reads the entire file into memory
        f.close()
        f = open(filePath, 'w')
        f.writelines(lines)
        f.close()
        print filePath  # was "print file", which prints the file builtin
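A likely culprit is the Linux OOM killer: `readlines()` materializes the whole file as a list of strings, which for a 146 MB file can need several times that much RAM, and the kernel reports an out-of-memory kill as a bare "Killed" in the shell (check `dmesg` to confirm). A streaming rewrite avoids holding the file in memory. The sketch below is a suggestion, not the poster's code: `strip_header_lines` is a hypothetical helper name, it uses `os.replace` (Python 3.3+) and `print()` from modern Python rather than the Python 2 idiom in the original post.

```python
import os
from glob import glob
from tempfile import NamedTemporaryFile

def strip_header_lines(path, n=2):
    """Copy `path` line by line into a temp file, skipping the first
    `n` lines, so the whole file never has to fit in memory."""
    # Create the temp file in the same directory so the final rename
    # stays on the same filesystem (rename across filesystems fails).
    dst = NamedTemporaryFile('w', dir=os.path.dirname(path), delete=False)
    try:
        with open(path, 'r') as src:
            for i, line in enumerate(src):
                if i >= n:
                    dst.write(line)
    finally:
        dst.close()
    os.replace(dst.name, path)  # atomic rename on POSIX

def manipFiles():
    for file_path in glob('/data/ascii/*.dat'):
        strip_header_lines(file_path)
        print(file_path)
```

Writing to a temp file and renaming also means a crash mid-run leaves the original file intact, unlike the read-then-truncate pattern above, which destroys the file if the process dies between the `open(filePath, 'w')` and the `writelines`.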


Sample lines in File:
=====================

# time, ap, bp, as, bs, price, vol, size, seq, isUpLast, isUpVol, isCancel

1062993789 0 0 0 0 1022.75 1 1 0 1 0 0
1073883668 1120 1119.75 28 33 0 0 0 0 0 0 0


Other Info
==========

- The file sizes range from 76 KB to 146 MB
- I'm running on a Gentoo Linux OS
- The disk is partitioned: XFS for the data repository, Reiser3 for
everything else.
