Python "compiler" is too slow for processing large data files???

Ian Holmes ianholmes01 at
Fri Sep 6 08:10:00 EDT 2002

"Ron Horn" <rchii at> wrote in message news:<Fpub9.22728$Ic7.1819173 at>...
> test on my Windows 2000 machine, 1.5GHz, Pentium 4:
> C:\test>python 8000 10
> C:\test>timer r "python -c \"import comp\""
> 8000
> Time passed: 00:00:09.39
> C:\test>python 8000 20
> C:\test>timer r "python -c \"import comp2\""
> 8000
> Time passed: 00:00:26.47

What else is your P4 running? It seems to be running a tad slow.

on my Celeron 500MHz Win2k, 384MB, Python 2.2.1, it
runs at 15-17s for 8000x20 and python never got above 2MB (though the CPU
was at 80-98% throughout)

Also, I imported psyco and proxied the main function, which reduced the
time to ~6s.

see below. psyco is available from - I imagine
that if the other functions are proxied too, then it may speed up some more.
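For anyone who hasn't used psyco, proxying is just wrapping one function. Here is a minimal sketch: `work` is a made-up CPU-bound stand-in (not from the script below), and the try/except lets it run unchanged where psyco isn't installed:

```python
try:
    import psyco            # Python 2 extension module; not available everywhere
    have_psyco = True
except ImportError:
    have_psyco = False

def work(n):
    # hypothetical hot loop standing in for whatever you want sped up
    total = 0
    for i in range(n):
        total += i * i
    return total

if have_psyco:
    # psyco.proxy returns a specialising wrapper; the original stays intact
    work = psyco.proxy(work)

print(work(10))  # prints 285 either way
```

The point of proxying only selected functions (rather than psyco.full()) is that you pay psyco's memory cost just for the hot spots.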

#modified with psyco and in script timing
# create a script to generate modules that will stress
# out the python compiler a bit?

import sys, random, time, psyco

def GenLine(numArgs, outfile):
    outfile.write("    (")
    for i in range(numArgs):
        outfile.write("%d, " % random.randint(0, 1000))
    outfile.write("),\n")  # close the tuple so the generated module parses

def Generate(numLines, numArgs, outfile):
    outfile.write("list1 = [\n")
    for i in range(numLines):
        GenLine(numArgs, outfile)
    outfile.write("]\n")  # close the list literal
    outfile.write("print len(list1)\n")

def main():
    outfile = None
    try:
        if len(sys.argv) != 4:
            print "Usage: gen <lines> <args> <outfile>"
            return

        numLines = int(sys.argv[1])
        numArgs = int(sys.argv[2])
        outfile = open(sys.argv[3], 'wt')

        Generate(numLines, numArgs, outfile)
    finally:
        if outfile: outfile.close()

if __name__ == "__main__":
    START_TIME = time.time()
    mainx = psyco.proxy(main)
    mainx()  # the proxy must actually be called, or main never runs
    print "end", time.time() - START_TIME
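What the benchmark is really timing is byte-compilation of one huge list literal, not its execution. You can see that without touching the disk by building the same kind of source text in memory and calling compile() on it directly; `make_source` here is a hypothetical helper mirroring Generate above:

```python
import random, time

def make_source(num_lines, num_args):
    # build the same "list1 = [ (..), (..), ... ]" text the script writes to a file
    rows = []
    for i in range(num_lines):
        row = "".join("%d, " % random.randint(0, 1000) for j in range(num_args))
        rows.append("    (%s)," % row)
    return "list1 = [\n%s\n]\n" % "\n".join(rows)

src = make_source(500, 10)

start = time.time()
code = compile(src, "<generated>", "exec")   # this is the expensive step
elapsed = time.time() - start

ns = {}
exec(code, ns)               # running the compiled code is cheap by comparison
print(len(ns["list1"]))      # prints 500
```

Scaling num_lines up and watching `elapsed` grow shows the cost is in the compiler, which is why importing the generated module (which compiles it) dominates the timings quoted above.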

More information about the Python-list mailing list