md5 and large files

Tim Peters tim.peters at gmail.com
Sun Oct 17 19:34:43 CEST 2004


[Brad Tilley]
> I would like to verify that the files are not corrupt so what's the most
> efficient way to calculate md5 sums on 4GB files? The machine doing the
> calculations is a small desktop with 256MB of RAM.

You'll find md5sum.py in your Python distribution.  It reads files 8KB
at a time, so it needs little RAM regardless of file size.  You can use
it directly, or copy the small calculation loop into your own program.
Note that reading 4GB of data will take significant time no matter what
you do.
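A minimal sketch of that kind of loop, using hashlib (the modern home
of MD5; the function name and chunk size here are illustrative, not
taken from md5sum.py itself):

```python
import hashlib

def md5_of_file(path, chunk_size=8192):
    """Compute the MD5 hex digest of a file without loading it all into RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        # Read fixed-size chunks until EOF; memory use stays near chunk_size.
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            h.update(chunk)
    return h.hexdigest()
```

Because only one chunk is held in memory at a time, this works the same
on a 4GB file as on a 4KB one; the run time is dominated by disk I/O.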
