Enormous Input and Output Test
duncan.booth at invalid.invalid
Sun Oct 4 12:28:24 CEST 2009
n00m <n00m at narod.ru> wrote:
> I've given up :-)
Here's my attempt; it's about 30% faster than your original, but I've no
idea whether it will be fast enough for you.
import sys, time, os, itertools

D = []  # global: keeps the large lists alive after main() returns

def main():
    ##sys.stdin = open('D:/1583.txt', 'rt')
    start = time.time()
    count = int(sys.stdin.readline())
    data = sys.stdin.read().split()
    data = map(int, data)
    nextval = iter(data).next
    data = map(str, (nextval()*nextval() for a in xrange(count)))
    sys.stdout.write('\n'.join(data) + '\n')
    D.append(data)
    print >>sys.stderr, time.time() - start

main()
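For anyone reading this thread today, a minimal Python 3 translation of the same bulk-I/O trick might look like the sketch below (the function name `solve` is mine; the pair-multiplication matches the `nextval()*nextval()` pattern above):

```python
import sys

def solve(text):
    # One bulk read plus a single split() is far cheaper than calling
    # readline() once per input line.
    tokens = text.split()
    count = int(tokens[0])
    it = iter(map(int, tokens[1:]))
    # Consume the numbers two at a time, multiply each pair, and build the
    # whole output string before doing a single write.
    return '\n'.join(str(next(it) * next(it)) for _ in range(count)) + '\n'

if __name__ == '__main__':
    sys.stdout.write(solve(sys.stdin.read()))
```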
Storing the large lists in a global prevents them from being deallocated
when the function returns, which improves the time as I recorded it. If
they are timing the whole process, then I think calling os._exit() should
avoid the final cleanup time.
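The _exit() point can be sketched as follows: os._exit() terminates the process immediately, skipping atexit handlers and interpreter teardown, which is exactly the cleanup cost referred to above. The child-script harness here is purely illustrative:

```python
import subprocess, sys, textwrap

# Child process registers an atexit handler; in 'hard' mode it calls
# os._exit(0), which skips that handler and all interpreter teardown.
CHILD = textwrap.dedent("""
    import atexit, os, sys
    atexit.register(lambda: sys.stdout.write('cleanup ran\\n'))
    sys.stdout.write('work done\\n')
    sys.stdout.flush()
    if sys.argv[1] == 'hard':
        os._exit(0)
""")

def run(mode):
    result = subprocess.run([sys.executable, '-c', CHILD, mode],
                            capture_output=True, text=True)
    return result.stdout
```

With mode 'soft' the atexit handler fires on normal exit; with 'hard' the process dies before any cleanup runs.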
Playing with the garbage collector makes it ever so slightly faster,
although disabling it entirely makes it much slower.
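The garbage-collector experiment can be reproduced with gc.disable()/gc.enable() around the hot section; whether it helps or hurts depends on the workload, so this is only a measurement harness (the `build` workload is made up for illustration):

```python
import gc, time

def build(n):
    # Allocation-heavy work: creates many container objects that the cyclic
    # collector would otherwise scan during its periodic collections.
    return [(i, str(i)) for i in range(n)]

def timed_build(n, disable_gc=False):
    if disable_gc:
        gc.disable()
    try:
        t0 = time.perf_counter()
        data = build(n)
        elapsed = time.perf_counter() - t0
    finally:
        gc.enable()  # always restore the collector
    return elapsed, len(data)
```

Comparing `timed_build(n)` against `timed_build(n, disable_gc=True)` on the real workload is the only way to know which way the trade-off goes.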
I feel there ought to be a faster alternative to xrange, but if so, I
couldn't find it.
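One candidate that may fit: in CPython 2, itertools.repeat(None, n) was a popular loop counter that was slightly faster than xrange, because it yields the same object n times instead of producing n integer objects. Written for modern Python 3 (where it still works, though range is already cheap), the idea looks like:

```python
import itertools

def pair_products(numbers, count):
    # Multiply consecutive pairs, looping `count` times without asking the
    # loop construct to produce a fresh integer on every iteration.
    it = iter(numbers)
    return [next(it) * next(it) for _ in itertools.repeat(None, count)]
```

The loop variable is never used, so the repeated None placeholder costs nothing extra.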
As Terry said, it's all hack tricks.