Downloading huge files via urllib
vlindberg at verio.net
Tue Sep 24 01:21:41 CEST 2002
For various reasons, I have to use https to download large (20+ MB) text
files which I then parse. I set up a basic function to do this using
response = urllib.urlretrieve(serverURL, 'run.log')
However, I then get a MemoryError. Tracking down the source of the
error, I see the offending function in httplib:
    def makefile(self, mode, bufsize=None):
        """Return a readable file-like object with data from socket.

        This method offers only partial support for the makefile
        interface of a real socket.  It only supports modes 'r' and
        'rb' and the bufsize argument is ignored.

        The returned object contains *all* of the file data
        """
I think the problem is that the bufsize argument is ignored, so the
whole response gets read into memory at once. Does anyone know if this
is correct, and if so, what I can do about it? I would still like to
automate downloading this file; is that possible?
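For what it's worth, here is the chunked-read pattern I would have
expected to work (save_in_chunks is just a name I'm using here). Over
plain http, urllib.urlopen returns a file-like object you can read
incrementally; over https, if the docstring above is right, the
FakeSocket wrapper has already buffered everything before this code
ever runs, so this alone may not fix my MemoryError:

```python
def save_in_chunks(response, path, chunk_size=64 * 1024):
    # Copy a file-like response object to disk one chunk at a time,
    # so the whole 20+ MB file never sits in memory at once.
    out = open(path, 'wb')
    try:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)
    finally:
        out.close()

# Hypothetical usage with the serverURL from my example:
#   response = urllib.urlopen(serverURL)
#   save_in_chunks(response, 'run.log')
```

Is there some way to get the same incremental behaviour out of the
https layer itself?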