Python "compiler" is too slow for processing large data files???
Ian Holmes
ianholmes01 at lycos.com
Fri Sep 6 08:10:00 EDT 2002
"Ron Horn" <rchii at lycos.com> wrote in message news:<Fpub9.22728$Ic7.1819173 at news2.west.cox.net>...
> test on my Windows 2000 machine, 1.5GHz, Pentium 4:
>
> C:\test>python compiletest.py 8000 10 comp.py
> C:\test>timer r "python -c \"import comp\""
> 8000
> Time passed: 00:00:09.39
>
> C:\test>python compiletest.py 8000 20 comp2.py
> C:\test>timer r "python -c \"import comp2\""
> 8000
> Time passed: 00:00:26.47
What else is your P4 running? It seems to be running a tad slow.
On my Celeron 500MHz Win2k box with 384MB RAM and Python 2.2.1 it
runs at 15-17s for 8000x20, and Python never got above 2MB of memory
(though the CPU was at 80-98% throughout).

I also imported psyco and proxied the main function, which reduced the
time to ~6s; see below. psyco is available from psyco.sourceforge.net.
I imagine that if the other functions are proxied too, it may speed up
some more.
# modified with psyco and in-script timing
# gen.py
# create a script to generate modules that will stress
# out the python compiler a bit

import sys, random, time, psyco

def GenLine(numArgs, outfile):
    outfile.write("    (")
    for i in range(numArgs):
        outfile.write("%d, " % random.randint(0, 1000))
    outfile.write("),\n")

def Generate(numLines, numArgs, outfile):
    outfile.write("list1 = [\n")
    for i in range(numLines):
        GenLine(numArgs, outfile)
    outfile.write("]\n")
    outfile.write("print len(list1)\n")

def main():
    outfile = None
    try:
        if len(sys.argv) != 4:
            print "Usage: gen <lines> <args> <outfile>"
            return
        numLines = int(sys.argv[1])
        numArgs = int(sys.argv[2])
        outfile = open(sys.argv[3], 'wt')
        Generate(numLines, numArgs, outfile)
    finally:
        if outfile: outfile.close()

if __name__ == "__main__":
    START_TIME = time.time()
    mainx = psyco.proxy(main)
    mainx()
    print "end", time.time() - START_TIME
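For anyone who wants to see what the generated comp.py actually contains
without writing a file, here's a minimal sketch of the same generator that
builds the module source as a string instead. The helper name
generate_module is my own (hypothetical, not from the script above); the
output format matches GenLine/Generate: one big list literal of tuples of
random ints, followed by a print of its length, which is what gives the
compiler a workout.

```python
import random

def generate_module(num_lines, num_args):
    # Hypothetical helper: same output as Generate() above,
    # but returned as a string instead of written to a file.
    parts = ["list1 = [\n"]
    for _ in range(num_lines):
        parts.append("    (")
        for _ in range(num_args):
            parts.append("%d, " % random.randint(0, 1000))
        parts.append("),\n")
    parts.append("]\n")
    parts.append("print len(list1)\n")
    return "".join(parts)

src = generate_module(2, 3)
```

For 8000x20 that string is roughly 8000 lines of tuple literals, so the
compile cost scales with both the line count and the args per line.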