MemoryError for large sourcecode

Andrew MacIntyre andymac at bullseye.apana.org.au
Mon Oct 7 03:13:44 CEST 2002


[posted & mailed]

On Sun, 6 Oct 2002, Gerson Kurz wrote:

> As a followup, you can use this code to generate a testfile that fails
> on my machine with MemoryError:
>
> t = open("huge.py","w")
> print >>t, "B = {}"
> for i in range(50000):
>     print >>t, "B[%d]=[0]*10" % (i)
> t.close()
>
> The maximum ever allocated by python.exe is around 80m in this case.
> The following allocation works fine:
>
> onemeg = 1024*1024
> bigchunk = "*" * onemeg * 800
>
> and allocates 800 m, and
>
> del bigchunk
>
> frees it again...
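
[The quoted generator uses Python 2's "print >>t" syntax; for anyone re-running the test on Python 3, the same file can be produced with a sketch like this:]

```python
# Python 3 sketch of the quoted Python 2 test-file generator.
# Produces the same huge.py: one dict assignment plus 50000 item
# assignments, each binding a small list.
with open("huge.py", "w") as t:
    t.write("B = {}\n")
    for i in range(50000):
        t.write("B[%d]=[0]*10\n" % i)
```

[The separate large-allocation check translates directly: `bigchunk = "*" * (1024 * 1024 * 800)` followed by `del bigchunk`.]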

While Windows 2000 is better than earlier releases of Windows, it still
has bogons in its memory management (to do with heap fragmentation,
among other things, I gather).

Tim Peters checked in a change to the parser which significantly
ameliorates this sort of problem; Tim's change will be in 2.2.2.

CVS (2.3 to be) has additional changes which further increase Python's
robustness in the face of platform memory management issues.

Other platforms have been affected by similar issues as well.

--
Andrew I MacIntyre                     "These thoughts are mine alone..."
E-mail: andymac at bullseye.apana.org.au  | Snail: PO Box 370
        andymac at pcug.org.au            |        Belconnen  ACT  2616
Web:    http://www.andymac.org/        |        Australia

More information about the Python-list mailing list