urllib hangs

Bernd Kaiser meldron at meldron.org
Wed Aug 25 11:53:06 CEST 2004


I would also use a few threads; they will speed up your script.
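
A minimal sketch of fetching several URLs from a pool of worker threads. This is written in current Python 3 syntax (`queue`, `urllib.request`); in the Python 2 of this thread the modules were named `Queue` and `urllib`. The actual fetch is simulated so the sketch runs without network access; you would replace the marked line with a real `urlopen` call.

```python
import threading
import queue

def worker(url_queue, results):
    # Pull URLs until the shared queue is empty, then exit.
    while True:
        try:
            url = url_queue.get_nowait()
        except queue.Empty:
            return
        # A real fetch would go here, e.g. urllib.request.urlopen(url).
        # Simulated so the sketch is self-contained:
        results[url] = 'fetched'

def fetch_all(urls, num_threads=4):
    """Fetch all URLs using num_threads worker threads."""
    url_queue = queue.Queue()
    for u in urls:
        url_queue.put(u)
    results = {}
    threads = [threading.Thread(target=worker, args=(url_queue, results))
               for _ in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Because each worker blocks independently, one slow server no longer stalls the whole run.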

Jay Donnell wrote:
> This is a basic version of my code
> 
> for url in urls:
>     fp = urllib.urlopen(url)
>     lines = fp.readlines()
> 
>     #print lines
>     for line in lines:
>         #print line
>         if(reUrl.search(line)):
>             print 'found'
>             return 1
>         else:
>             print 'not found'
>             return 0
> 
> this hangs occasionally for certain URLs. If I do a Ctrl-C
> (Linux) it will move on to the next URL. How can I get this to
> time out and move on to the next URL?
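
One way to get the timeout is `socket.setdefaulttimeout()` (available since Python 2.3), which makes any blocking socket operation raise after the given number of seconds instead of hanging. A sketch, using the Python 3 spelling `urllib.request.urlopen` (plain `urllib.urlopen` in Python 2); the `reUrl` pattern here is a placeholder, since the original is not shown. Note also that the quoted loop's `else: return 0` bails out on the first non-matching line; the sketch checks every line before giving up.

```python
import re
import socket
import urllib.request  # urllib.urlopen in Python 2

# Any socket operation that blocks longer than 10 s now raises socket.timeout.
socket.setdefaulttimeout(10)

reUrl = re.compile('example')  # placeholder; substitute your real pattern

def check_url(url):
    """Return 1 if the pattern matches any line of the page, else 0."""
    try:
        fp = urllib.request.urlopen(url)
    except (socket.timeout, OSError):
        return 0  # timed out or failed: treat as not found and move on
    try:
        for line in fp:
            if reUrl.search(line.decode('latin-1', 'replace')):
                return 1
        return 0  # only after every line has been checked
    finally:
        fp.close()
```

In current Python, `urllib.request.urlopen(url, timeout=10)` sets the timeout per call instead of globally, which is usually preferable.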
