Downloading huge files via urllib
VanL
vlindberg at verio.net
Mon Sep 23 19:21:41 EDT 2002
Hello,
For various reasons, I have to use https to download large (20+ MB) text
files which I then parse. I set up a basic download using urllib:

    response = urllib.urlretrieve(serverURL, 'run.log')
However, I then get a MemoryError. Tracking down the source of the
error, I see the offending function in httplib:
    def makefile(self, mode, bufsize=None):
        """Return a readable file-like object with data from socket.

        This method offers only partial support for the makefile
        interface of a real socket.  It only supports modes 'r' and
        'rb' and the bufsize argument is ignored.

        The returned object contains *all* of the file data
        """
I think the problem is the bufsize argument being ignored: since the
returned object contains *all* of the file data, the entire 20+ MB download
has to fit in memory at once. Does anyone know if this is correct, and what
I can do about it? I would like to automate the process of downloading this
file; is that possible?
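In case it makes the question clearer, here is roughly the chunked copy I
was hoping to end up with. This is only a sketch: it assumes the object
returned by urllib.urlopen can actually be read a piece at a time, which
the ignored bufsize makes me doubt for https.

    import urllib

    CHUNK = 64 * 1024  # read 64 KB at a time instead of the whole body

    def fetch(url, filename):
        # Copy url to a local file without holding all 20+ MB in memory.
        src = urllib.urlopen(url)
        dst = open(filename, 'wb')
        try:
            while 1:
                data = src.read(CHUNK)
                if not data:
                    break
                dst.write(data)
        finally:
            dst.close()
            src.close()

    fetch(serverURL, 'run.log')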
Thanks,
VanL