http request timeout
garth_grimm at hp.com
Tue Oct 2 21:56:38 CEST 2001
While handling a user web request, I'm issuing an HTTP request to another server and parsing the XML
in the response to finish handling the original user's request. I've got the functionality working,
but am now trying to do some exception handling to improve the robustness of the interaction.
I can handle most failure situations (e.g. failed network connection, invalid "XML" returned, etc.)
using try/catch constructs, but the situation I'm currently stuck on is when the remote server
accepts the connection, but for some reason (e.g. process hung?) doesn't send a response. This will
generate an infinite wait and prevent my code from finishing the response to the original user.
What I'd love to do is set a time-out on the HTTP connection.
But I can't see any obvious way to do this in the httplib module -- no constructor or method seems
to take a timeout parameter.
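One process-level workaround I can imagine (Unix-only, and assuming the blocking call happens in the main thread -- all names here are hypothetical, not anything httplib provides) is to interrupt the hung call with SIGALRM rather than asking httplib for a timeout it doesn't offer:

```python
import signal

class Timeout(Exception):
    """Raised when the alarm fires before the blocking call returns."""
    pass

def run_with_alarm(func, seconds):
    """Sketch: run func(), but raise Timeout if it blocks longer than
    `seconds`. Unix-only, main-thread-only, whole seconds only."""
    def handler(signum, frame):
        raise Timeout("no result within %d seconds" % seconds)
    old = signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)
    try:
        return func()
    finally:
        signal.alarm(0)                    # cancel any pending alarm
        signal.signal(signal.SIGALRM, old) # restore previous handler
```

The httplib request/getresponse pair would then be wrapped in the func passed to run_with_alarm; a Timeout escaping the wrapper can be caught alongside the other failures already handled by try/except.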
Even if I go down to the socket module level (which I'd like to avoid), I still see no built-in timeout mechanism.
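At the socket level, one sketch that does work with only the standard library (a bare-bones HTTP/1.0 fetch -- no redirects, chunked encoding, or error handling; the helper name is made up) is to use select() to put a deadline on waiting for the server's response:

```python
import select
import socket

def fetch_with_timeout(host, port, request, timeout=10.0):
    """Send a raw HTTP request and wait at most `timeout` seconds
    for each piece of the response to arrive."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.connect((host, port))
        sock.sendall(request)
        # select() returns an empty ready list if the deadline passes
        # before the server sends anything back -- the hung-server case.
        ready, _, _ = select.select([sock], [], [], timeout)
        if not ready:
            raise IOError("no response from %s within %.1fs" % (host, timeout))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break                     # server closed the connection
            chunks.append(data)
            ready, _, _ = select.select([sock], [], [], timeout)
            if not ready:
                break                     # stalled mid-response; keep what we have
        return b"".join(chunks)
    finally:
        sock.close()
```

This gives up the conveniences of httplib (header parsing, status handling), which is exactly why dropping to this level is unappealing, but it does bound the wait.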
I've thought about using a watchdog thread approach, but the threading module documentation states
that "threads cannot be destroyed, stopped, suspended, resumed, or interrupted". It would seem, then,
that my user-request-handling thread could certainly finish after the watchdog times out, but the
connection thread to the remote server might continue to run. This would quickly lead to an
unacceptable number of threads waiting for responses from the remote server.
I would think that dealing with this failure mode would be fairly typical. Have I missed an obvious
built-in solution? If there is none, what approaches have others used to best handle this in Python?