Python "compiler" is too slow for processing large data files???
ianholmes01 at lycos.com
Fri Sep 6 08:10:00 EDT 2002
"Ron Horn" <rchii at lycos.com> wrote in message news:<Fpub9.22728$Ic7.1819173 at news2.west.cox.net>...
> test on my Windows 2000 machine, 1.5GHz, Pentium 4:
> C:\test>python compiletest.py 8000 10 comp.py
> C:\test>timer r "python -c \"import comp\""
> Time passed: 00:00:09.39
> C:\test>python compiletest.py 8000 20 comp2.py
> C:\test>timer r "python -c \"import comp2\""
> Time passed: 00:00:26.47
What else is your P4 running? It seems to be running a tad slow.
On my Celeron 500MHz, Win2k, 384MB, Python 2.2.1, it runs at 15-17s
for 8000x20, and Python never got above 2MB of memory (though the CPU
was at 80-98% throughout).
Also, I imported psyco and proxied the main function, which reduced the
time to ~6s.
See below. psyco is available from psyco.sourceforge.net - I imagine
that if the other functions are proxied too, it may speed up some more.
# modified with psyco and in-script timing
# create a script to generate modules that will stress
# out the python compiler a bit
import sys, random, time, psyco

def GenLine(numArgs, outfile):
    for i in range(numArgs):
        outfile.write("%d, " % random.randint(0, 1000))

def Generate(numLines, numArgs, outfile):
    outfile.write("list1 = [\n")
    for i in range(numLines):
        GenLine(numArgs, outfile)
        outfile.write("\n")
    outfile.write("]\n")

def main():
    outfile = None
    if len(sys.argv) != 4:
        print "Usage: gen <lines> <args> <outfile>"
        return
    numLines = int(sys.argv[1])
    numArgs = int(sys.argv[2])
    outfile = open(sys.argv[3], 'wt')
    Generate(numLines, numArgs, outfile)
    if outfile: outfile.close()

if __name__ == "__main__":
    START_TIME = time.time()
    mainx = psyco.proxy(main)
    mainx()
    print "end", time.time() - START_TIME
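Since the thread is about the cost of compiling a huge list literal, one way to measure just the compiler (rather than the whole import) is to build the generated source in memory and time compile() directly. This is a sketch, not from the original post; make_source is a hypothetical helper that mimics what the generator script writes to disk:

```python
# Sketch: isolate the compile cost of a big generated list literal.
import random, time

def make_source(numLines, numArgs):
    # Build "list1 = [ ... ]" with numLines rows of numArgs random ints,
    # the same shape of module the generator script writes to a file.
    rows = []
    for i in range(numLines):
        row = ", ".join(["%d" % random.randint(0, 1000) for j in range(numArgs)])
        rows.append(row + ",")
    return "list1 = [\n" + "\n".join(rows) + "\n]\n"

src = make_source(8000, 20)
start = time.time()
code = compile(src, "<generated>", "exec")  # compile only, no exec/import
elapsed = time.time() - start
print("compile took %.2fs" % elapsed)
```

Timing compile() alone avoids counting the disk write, the import machinery, and the .pyc generation, so it gives a cleaner number for the "compiler is slow" claim.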