urllib2 multithreading error
viscanti at gmail.com
Tue Jan 16 09:41:51 EST 2007
Hi,
I'm using urllib2 to retrieve some data over HTTP in a multithreaded
application.
Here's a piece of code:
import urllib2

req = urllib2.Request(url, txdata, txheaders)
opener = urllib2.build_opener()
opener.addheaders = [('User-agent', user_agent)]
request = opener.open(req)
data = request.read(1024)  # read only the first 1024 bytes
I'm trying to read only the first 1024 bytes to inspect the HTTP headers
(if the content is HTML, I will then retrieve the entire page).
When I use it in a single thread everything works fine, but when I
create multiple threads, execution halts and the program terminates just
before the last line (when I execute request.read()). Obviously I tried
to catch the exception, but that doesn't work: the interpreter exits
without any exception or message.
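For reference, here is a self-contained sketch of the pattern described above, written for modern Python (urllib2 became urllib.request in Python 3). It spawns several threads, each with its own opener, and fetches from a local http.server instance so no external network is needed; the handler, user agent, and URL are placeholders, not the original poster's values:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    """Placeholder handler that serves a tiny HTML page."""
    def do_GET(self):
        body = b"<html>hello</html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):
        pass  # silence per-request logging

# Start a local server on an ephemeral port.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]

results = []
lock = threading.Lock()

def fetch():
    # Each thread builds its own opener: opener objects are not
    # documented as thread-safe, so sharing one across threads is risky.
    opener = urllib.request.build_opener()
    opener.addheaders = [("User-agent", "example-agent")]
    resp = opener.open(url)
    data = resp.read(1024)  # first chunk only, as in the post
    with lock:
        results.append((resp.headers.get("Content-Type"), data))

threads = [threading.Thread(target=fetch) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
server.shutdown()
```

Note that the response headers are already available via `resp.headers` as soon as `open()` returns, before any body bytes are read.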
How can I solve this?
lv
More information about the Python-list mailing list