simple httplib and urllib timeout question

musingattheruins at
Thu Apr 13 12:31:49 EDT 2000

> > Just a quick question. I'm writing a small webbot as a Python
> > learning experience, and it comes across some sites that make it
> > hang for a while, if the site is down or extremely slow etc. Is
> > there a way to set a faster timeout for httplib? Or is there a
> > better way to handle it?

I had the same problem and would get back exceptions that said a
timeout had occurred.  I used a retry as a work-around....

import httplib

class Page:
	"Page: class to get a page from the web."
	def get(self, host, page):
		for i in range(3):
			try:
				http = httplib.HTTP(host)
				http.putrequest("GET", page)
				http.putheader("Accept", "text/html")
				http.putheader("Accept", "text/plain")
				http.endheaders()
				errcode, errmsg, headers = http.getreply()
				fp = http.getfile()
				sData = fp.read() or ""
				fp.close()
				del http
				if len(sData) > 0:
					return sData
			except:
				pass	# timed out or otherwise failed; try again
		return ""

# usage:
p = Page()
text = p.get(host, page)

which will retry up to three times if the returned data is empty or if
an exception is raised.
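For what it's worth, if you're on a newer Python (2.3 or later), you can avoid the hang entirely by setting a global socket timeout before making requests; this is a sketch, assuming the 10-second value suits your webbot:

```python
import socket

# Any socket created after this call (including the ones httplib / urllib
# open under the hood) will raise socket.timeout instead of blocking
# indefinitely on a dead or very slow host.
socket.setdefaulttimeout(10.0)   # seconds; pick a value that suits you
```

Combined with the retry loop above, a slow site costs you at most 3 x 10 seconds instead of hanging the bot.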

L8R :-)


More information about the Python-list mailing list