[Python-checkins] bpo-43650: Fix MemoryError on zip.read in shutil._unpack_zipfile for large files (GH-25058) (GH-26190)

gpshead webhook-mailer at python.org
Mon May 17 13:35:34 EDT 2021


https://github.com/python/cpython/commit/7a588621c2854bcef6ce9eeb54349b84ac078c45
commit: 7a588621c2854bcef6ce9eeb54349b84ac078c45
branch: 3.10
author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com>
committer: gpshead <greg at krypto.org>
date: 2021-05-17T10:35:30-07:00
summary:

bpo-43650: Fix MemoryError on zip.read in shutil._unpack_zipfile for large files (GH-25058) (GH-26190)

`shutil.unpack_archive()` tried to read each whole file into memory, making no use of any kind of smaller buffer. The process crashes for really large files (e.g. an archive of ~1.7 GB that unpacks to ~10 GB), and before crashing it can easily exhaust all available RAM on smaller systems. Had to pull the streaming code from `zipfile.ZipFile.extractall()` to fix this.

Automerge-Triggered-By: GH:gpshead
(cherry picked from commit f32c7950e0077b6d9a8e217c2796fc582f18ca08)

Co-authored-by: Igor Bolshakov <ibolsch at gmail.com>

files:
A Misc/NEWS.d/next/Library/2021-03-29-00-23-30.bpo-43650.v01tic.rst
M Lib/shutil.py

diff --git a/Lib/shutil.py b/Lib/shutil.py
index e29fe4d83e927..33f2ca94bf4bc 100644
--- a/Lib/shutil.py
+++ b/Lib/shutil.py
@@ -1161,20 +1161,16 @@ def _unpack_zipfile(filename, extract_dir):
             if name.startswith('/') or '..' in name:
                 continue
 
-            target = os.path.join(extract_dir, *name.split('/'))
-            if not target:
+            targetpath = os.path.join(extract_dir, *name.split('/'))
+            if not targetpath:
                 continue
 
-            _ensure_directory(target)
+            _ensure_directory(targetpath)
             if not name.endswith('/'):
                 # file
-                data = zip.read(info.filename)
-                f = open(target, 'wb')
-                try:
-                    f.write(data)
-                finally:
-                    f.close()
-                    del data
+                with zip.open(name, 'r') as source, \
+                        open(targetpath, 'wb') as target:
+                    copyfileobj(source, target)
     finally:
         zip.close()
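The hunk above replaces the all-at-once `zip.read()` with a chunked copy via `shutil.copyfileobj`, so memory use stays bounded regardless of member size. A minimal standalone sketch of the same approach (the helper name `stream_unpack_zip` and the inline `os.makedirs` in place of `_ensure_directory` are illustrative, not part of the patch):

```python
import os
import shutil
import zipfile

def stream_unpack_zip(filename, extract_dir):
    """Extract a zip archive, copying each member in chunks.

    Mirrors the fixed _unpack_zipfile: members are streamed with
    shutil.copyfileobj instead of being fully loaded via zip.read().
    """
    with zipfile.ZipFile(filename) as zf:
        for info in zf.infolist():
            name = info.filename
            # Skip unsafe paths, as _unpack_zipfile's guard does.
            if name.startswith('/') or '..' in name:
                continue
            targetpath = os.path.join(extract_dir, *name.split('/'))
            if not targetpath:
                continue
            # Stand-in for shutil._ensure_directory().
            os.makedirs(os.path.dirname(targetpath), exist_ok=True)
            if not name.endswith('/'):
                # Chunked copy: constant memory even for huge members.
                with zf.open(name, 'r') as source, \
                        open(targetpath, 'wb') as target:
                    shutil.copyfileobj(source, target)
```

`copyfileobj` defaults to 64 KiB reads, which is why the fixed code no longer needs the `del data` cleanup the old version used.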
 
diff --git a/Misc/NEWS.d/next/Library/2021-03-29-00-23-30.bpo-43650.v01tic.rst b/Misc/NEWS.d/next/Library/2021-03-29-00-23-30.bpo-43650.v01tic.rst
new file mode 100644
index 0000000000000..a2ea4a4800a73
--- /dev/null
+++ b/Misc/NEWS.d/next/Library/2021-03-29-00-23-30.bpo-43650.v01tic.rst
@@ -0,0 +1,2 @@
+Fix :exc:`MemoryError` in :func:`shutil.unpack_archive` which fails inside
+:func:`shutil._unpack_zipfile` on large files. Patch by Igor Bolshakov.
