[Tutor] memory error files over 100MB

Moos Heintzen iwasroot at gmail.com
Tue Mar 10 20:33:16 CET 2009

On Tue, Mar 10, 2009 at 8:45 AM, Harris, Sarah L
<sarah.l.harris at jpl.nasa.gov> wrote:
> That looks better, thank you.
> However I still have a memory error when I try to run it on three or more files that are over 100 MB?

How big are the files inside the zip file?

It seems that in this line


the compressed file is unzipped into memory first and then written out
to the new file.

You can read and write in smaller chunks using the file objects
returned by zf.open(), whose read() method takes a size parameter.
(Although maybe it wouldn't help if the whole file gets extracted to
memory anyway.)
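To illustrate the chunked approach, here is a minimal sketch (the
function name and chunk size are my own, not from your script): it
copies one archive member to disk a block at a time, so the whole
member is never held in memory at once.

```python
import shutil
import zipfile

def extract_member(zip_path, member, dest_path, chunk_size=64 * 1024):
    """Copy one zip member to dest_path in fixed-size chunks."""
    with zipfile.ZipFile(zip_path) as zf:
        # zf.open() gives a file-like object; copyfileobj reads and
        # writes chunk_size bytes at a time instead of all at once.
        with zf.open(member) as src, open(dest_path, "wb") as dst:
            shutil.copyfileobj(src, dst, chunk_size)
```

(The with-statement forms here need a newer Python than 2.6; under 2.6
you would call close() explicitly instead.)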

However, the open() method is new in Python 2.6, and Python 2.6 also
adds the extractall() method


which does what you're doing. I'm not sure whether it avoids the
memory error, though.
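With extractall() the loop you wrote collapses to a couple of lines.
A minimal sketch (the function name is mine, for illustration):

```python
import zipfile

def unzip_all(zip_path, dest_dir):
    """Extract every member of the archive into dest_dir,
    recreating the directory structure stored in the zip."""
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)
```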

Also, are the files in your zip archives stored outside of any
directories? I ask because you're not creating directories when you
extract.
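If the archives do contain directories, a manual extraction loop has
to create the parent directories itself before opening each output
file. A hedged sketch of what I mean (names are hypothetical):

```python
import os
import shutil
import zipfile

def extract_with_dirs(zip_path, dest_dir):
    """Extract each member, creating any parent directories
    that the member's path implies."""
    with zipfile.ZipFile(zip_path) as zf:
        for name in zf.namelist():
            target = os.path.join(dest_dir, name)
            if name.endswith("/"):
                # Explicit directory entry in the archive.
                os.makedirs(target, exist_ok=True)
                continue
            parent = os.path.dirname(target)
            if parent:
                os.makedirs(parent, exist_ok=True)
            # Copy in chunks rather than reading the whole member.
            with zf.open(name) as src, open(target, "wb") as dst:
                shutil.copyfileobj(src, dst)
```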

Since this is a learning experience, it might also help to break the
code into functions to minimize clutter, and to keep familiarizing
yourself with the language.
