Enormous Input and Output Test

nn pruebauno at latinmail.com
Mon Oct 5 12:08:59 EDT 2009


On Oct 4, 8:41 am, Duncan Booth <duncan.bo... at invalid.invalid> wrote:
> Jon Clements <jon... at googlemail.com> wrote:
> > On Oct 4, 12:08 pm, n00m <n... at narod.ru> wrote:
> >> Duncan Booth,
>
> >> alas... still TLE:
>
> >> 2800839
> >> 2009-10-04 13:03:59
> >> Q
> >> Enormous Input and Output Test
> >> time limit exceeded
> >> -
> >> 88M
> >> PYTH
>
> > Just to throw into the mix...
>
> > What about buffering? Does anyone know what the effective stdin buffer
> > is for Python? I mean, it really can't be the multiplying that's a
> > bottleneck. Not sure if it's possible, but can a new stdin be created
> > (possibly using os.fdopen) with a hefty buffer size?
>
> > I'm probably way off, but something to share.
>
> I did try a version where I just read the data in, split it up, and then
> wrote it out again. On my test file that took about 2 seconds compared with
> the 8 seconds it took the full code I posted, so while there may be scope
> for faster I/O (e.g. using mmap), any real speedup would have to be in the
> convert to int, multiply, convert back to str pipeline.
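
Regarding the buffering idea quoted above: one way to try it is to rebind
stdin and stdout with os.fdopen and a bigger buffer. This is only a sketch;
the 1 MB buffer size is a guess and I haven't measured whether it actually
helps on the judge.

import sys, os

# Rebind stdin/stdout to file objects with a large (1 MB) buffer.
# The buffer size is arbitrary; tune it if it makes any difference.
sys.stdin = os.fdopen(sys.stdin.fileno(), 'r', 1024 * 1024)
sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 1024 * 1024)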

This takes 5 seconds on my machine using a file with 1,000,000 random
entries:

inf = open('tstf')
outf = open('tstof', 'w')
# The first line gives the number of test cases that follow.
for i in xrange(int(inf.next(), 10)):
    # Each remaining line holds two integers separated by a single space.
    n1, s, n2 = inf.next().partition(' ')
    print >>outf, int(n1, 10) * int(n2, 10)
outf.close()
inf.close()
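
If the many small writes are the real cost, another thing worth trying
(untested against the judge, so treat it as a sketch) is collecting the
results in a list and writing them out in one go:

import sys

inf = open('tstf')
out = []
for i in xrange(int(inf.next())):
    n1, s, n2 = inf.next().partition(' ')
    out.append(str(int(n1) * int(n2)))
inf.close()

# One big write instead of a million small ones.
sys.stdout.write('\n'.join(out) + '\n')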


