bpo-43650: Fix MemoryError on zip.read in shutil._unpack_zipfile for large files (GH-25058) (GH-26190)

`shutil.unpack_archive()` tries to read the whole file into memory, making no use of any kind of smaller buffer, so the process crashes for really large files (e.g. archive ~1.7 GB, unpacked ~10 GB). Before the crash it can easily take away all available RAM on smaller systems. Had to pull the code from `zipfile.ZipFile.extractall()` to fix this.
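
For illustration, a minimal sketch of the buffered vs. streamed approaches (the helper names and paths here are hypothetical; the actual change is in the diff below):

```python
import shutil
import zipfile

# Old behaviour: ZipFile.read() materializes the whole member as a single
# bytes object, so a 10 GB member needs 10 GB of RAM before any byte
# reaches disk.
def extract_buffered(zip_path, member, dest):
    with zipfile.ZipFile(zip_path) as zf:
        data = zf.read(member)
        with open(dest, 'wb') as f:
            f.write(data)

# Fixed behaviour: ZipFile.open() yields a file-like object that
# shutil.copyfileobj() drains in fixed-size chunks, so peak memory use
# stays constant no matter how large the member is.
def extract_streamed(zip_path, member, dest):
    with zipfile.ZipFile(zip_path) as zf:
        with zf.open(member, 'r') as source, open(dest, 'wb') as target:
            shutil.copyfileobj(source, target)
```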

Automerge-Triggered-By: GH:gpshead
(cherry picked from commit f32c7950e0)

Co-authored-by: Igor Bolshakov <ibolsch@gmail.com>
commit 7a588621c2 (parent 60fa8b32db)
Miss Islington (bot), 2021-05-17 10:35:30 -07:00, committed by GitHub
2 changed files with 8 additions and 10 deletions


@@ -1161,20 +1161,16 @@ def _unpack_zipfile(filename, extract_dir):
             # don't extract absolute paths or ones with .. in them
             if name.startswith('/') or '..' in name:
                 continue
 
-            target = os.path.join(extract_dir, *name.split('/'))
-            if not target:
+            targetpath = os.path.join(extract_dir, *name.split('/'))
+            if not targetpath:
                 continue
 
-            _ensure_directory(target)
+            _ensure_directory(targetpath)
             if not name.endswith('/'):
                 # file
-                data = zip.read(info.filename)
-                f = open(target, 'wb')
-                try:
-                    f.write(data)
-                finally:
-                    f.close()
-                del data
+                with zip.open(name, 'r') as source, \
+                        open(targetpath, 'wb') as target:
+                    copyfileobj(source, target)
     finally:
         zip.close()
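
With the streamed copy in place, unpacking through the public API stays memory-bounded; a small usage sketch (the archive name and output directory are hypothetical):

```python
import shutil

# Peak memory is now bounded by copyfileobj()'s buffer rather than by
# the size of the largest archive member.
shutil.unpack_archive('big-archive.zip', extract_dir='unpacked')
```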